Oct 08 18:48:51 localhost kernel: Linux version 5.14.0-620.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Fri Sep 26 01:13:23 UTC 2025
Oct 08 18:48:51 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Oct 08 18:48:51 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 08 18:48:51 localhost kernel: BIOS-provided physical RAM map:
Oct 08 18:48:51 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Oct 08 18:48:51 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Oct 08 18:48:51 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Oct 08 18:48:51 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Oct 08 18:48:51 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Oct 08 18:48:51 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Oct 08 18:48:51 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Oct 08 18:48:51 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Oct 08 18:48:51 localhost kernel: NX (Execute Disable) protection: active
Oct 08 18:48:51 localhost kernel: APIC: Static calls initialized
Oct 08 18:48:51 localhost kernel: SMBIOS 2.8 present.
Oct 08 18:48:51 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Oct 08 18:48:51 localhost kernel: Hypervisor detected: KVM
Oct 08 18:48:51 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Oct 08 18:48:51 localhost kernel: kvm-clock: using sched offset of 7394704554048 cycles
Oct 08 18:48:51 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Oct 08 18:48:51 localhost kernel: tsc: Detected 2800.000 MHz processor
Oct 08 18:48:51 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Oct 08 18:48:51 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Oct 08 18:48:51 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Oct 08 18:48:51 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Oct 08 18:48:51 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Oct 08 18:48:51 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Oct 08 18:48:51 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Oct 08 18:48:51 localhost kernel: Using GB pages for direct mapping
Oct 08 18:48:51 localhost kernel: RAMDISK: [mem 0x2d7c4000-0x32bd9fff]
Oct 08 18:48:51 localhost kernel: ACPI: Early table checksum verification disabled
Oct 08 18:48:51 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Oct 08 18:48:51 localhost kernel: ACPI: RSDT 0x00000000BFFE16C4 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 08 18:48:51 localhost kernel: ACPI: FACP 0x00000000BFFE1578 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 08 18:48:51 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F8 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 08 18:48:51 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Oct 08 18:48:51 localhost kernel: ACPI: APIC 0x00000000BFFE15EC 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 08 18:48:51 localhost kernel: ACPI: WAET 0x00000000BFFE169C 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 08 18:48:51 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1578-0xbffe15eb]
Oct 08 18:48:51 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1577]
Oct 08 18:48:51 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Oct 08 18:48:51 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15ec-0xbffe169b]
Oct 08 18:48:51 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe169c-0xbffe16c3]
Oct 08 18:48:51 localhost kernel: No NUMA configuration found
Oct 08 18:48:51 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Oct 08 18:48:51 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Oct 08 18:48:51 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Oct 08 18:48:51 localhost kernel: Zone ranges:
Oct 08 18:48:51 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Oct 08 18:48:51 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Oct 08 18:48:51 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Oct 08 18:48:51 localhost kernel:   Device   empty
Oct 08 18:48:51 localhost kernel: Movable zone start for each node
Oct 08 18:48:51 localhost kernel: Early memory node ranges
Oct 08 18:48:51 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Oct 08 18:48:51 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Oct 08 18:48:51 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Oct 08 18:48:51 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Oct 08 18:48:51 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct 08 18:48:51 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Oct 08 18:48:51 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Oct 08 18:48:51 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Oct 08 18:48:51 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Oct 08 18:48:51 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Oct 08 18:48:51 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Oct 08 18:48:51 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Oct 08 18:48:51 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Oct 08 18:48:51 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Oct 08 18:48:51 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Oct 08 18:48:51 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct 08 18:48:51 localhost kernel: TSC deadline timer available
Oct 08 18:48:51 localhost kernel: CPU topo: Max. logical packages:   8
Oct 08 18:48:51 localhost kernel: CPU topo: Max. logical dies:       8
Oct 08 18:48:51 localhost kernel: CPU topo: Max. dies per package:   1
Oct 08 18:48:51 localhost kernel: CPU topo: Max. threads per core:   1
Oct 08 18:48:51 localhost kernel: CPU topo: Num. cores per package:     1
Oct 08 18:48:51 localhost kernel: CPU topo: Num. threads per package:   1
Oct 08 18:48:51 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Oct 08 18:48:51 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Oct 08 18:48:51 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Oct 08 18:48:51 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Oct 08 18:48:51 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Oct 08 18:48:51 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Oct 08 18:48:51 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Oct 08 18:48:51 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Oct 08 18:48:51 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Oct 08 18:48:51 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Oct 08 18:48:51 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Oct 08 18:48:51 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Oct 08 18:48:51 localhost kernel: Booting paravirtualized kernel on KVM
Oct 08 18:48:51 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct 08 18:48:51 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Oct 08 18:48:51 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Oct 08 18:48:51 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Oct 08 18:48:51 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Oct 08 18:48:51 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Oct 08 18:48:51 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 08 18:48:51 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64", will be passed to user space.
Oct 08 18:48:51 localhost kernel: random: crng init done
Oct 08 18:48:51 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Oct 08 18:48:51 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Oct 08 18:48:51 localhost kernel: Fallback order for Node 0: 0 
Oct 08 18:48:51 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Oct 08 18:48:51 localhost kernel: Policy zone: Normal
Oct 08 18:48:51 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct 08 18:48:51 localhost kernel: software IO TLB: area num 8.
Oct 08 18:48:51 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Oct 08 18:48:51 localhost kernel: ftrace: allocating 49370 entries in 193 pages
Oct 08 18:48:51 localhost kernel: ftrace: allocated 193 pages with 3 groups
Oct 08 18:48:51 localhost kernel: Dynamic Preempt: voluntary
Oct 08 18:48:51 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Oct 08 18:48:51 localhost kernel: rcu:         RCU event tracing is enabled.
Oct 08 18:48:51 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Oct 08 18:48:51 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Oct 08 18:48:51 localhost kernel:         Rude variant of Tasks RCU enabled.
Oct 08 18:48:51 localhost kernel:         Tracing variant of Tasks RCU enabled.
Oct 08 18:48:51 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct 08 18:48:51 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Oct 08 18:48:51 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 08 18:48:51 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 08 18:48:51 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 08 18:48:51 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Oct 08 18:48:51 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Oct 08 18:48:51 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Oct 08 18:48:51 localhost kernel: Console: colour VGA+ 80x25
Oct 08 18:48:51 localhost kernel: printk: console [ttyS0] enabled
Oct 08 18:48:51 localhost kernel: ACPI: Core revision 20230331
Oct 08 18:48:51 localhost kernel: APIC: Switch to symmetric I/O mode setup
Oct 08 18:48:51 localhost kernel: x2apic enabled
Oct 08 18:48:51 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Oct 08 18:48:51 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Oct 08 18:48:51 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Oct 08 18:48:51 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Oct 08 18:48:51 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Oct 08 18:48:51 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Oct 08 18:48:51 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct 08 18:48:51 localhost kernel: Spectre V2 : Mitigation: Retpolines
Oct 08 18:48:51 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Oct 08 18:48:51 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Oct 08 18:48:51 localhost kernel: RETBleed: Mitigation: untrained return thunk
Oct 08 18:48:51 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct 08 18:48:51 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct 08 18:48:51 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Oct 08 18:48:51 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Oct 08 18:48:51 localhost kernel: x86/bugs: return thunk changed
Oct 08 18:48:51 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Oct 08 18:48:51 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct 08 18:48:51 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct 08 18:48:51 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct 08 18:48:51 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Oct 08 18:48:51 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Oct 08 18:48:51 localhost kernel: Freeing SMP alternatives memory: 40K
Oct 08 18:48:51 localhost kernel: pid_max: default: 32768 minimum: 301
Oct 08 18:48:51 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Oct 08 18:48:51 localhost kernel: landlock: Up and running.
Oct 08 18:48:51 localhost kernel: Yama: becoming mindful.
Oct 08 18:48:51 localhost kernel: SELinux:  Initializing.
Oct 08 18:48:51 localhost kernel: LSM support for eBPF active
Oct 08 18:48:51 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct 08 18:48:51 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct 08 18:48:51 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Oct 08 18:48:51 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Oct 08 18:48:51 localhost kernel: ... version:                0
Oct 08 18:48:51 localhost kernel: ... bit width:              48
Oct 08 18:48:51 localhost kernel: ... generic registers:      6
Oct 08 18:48:51 localhost kernel: ... value mask:             0000ffffffffffff
Oct 08 18:48:51 localhost kernel: ... max period:             00007fffffffffff
Oct 08 18:48:51 localhost kernel: ... fixed-purpose events:   0
Oct 08 18:48:51 localhost kernel: ... event mask:             000000000000003f
Oct 08 18:48:51 localhost kernel: signal: max sigframe size: 1776
Oct 08 18:48:51 localhost kernel: rcu: Hierarchical SRCU implementation.
Oct 08 18:48:51 localhost kernel: rcu:         Max phase no-delay instances is 400.
Oct 08 18:48:51 localhost kernel: smp: Bringing up secondary CPUs ...
Oct 08 18:48:51 localhost kernel: smpboot: x86: Booting SMP configuration:
Oct 08 18:48:51 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Oct 08 18:48:51 localhost kernel: smp: Brought up 1 node, 8 CPUs
Oct 08 18:48:51 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Oct 08 18:48:51 localhost kernel: node 0 deferred pages initialised in 32ms
Oct 08 18:48:51 localhost kernel: Memory: 7765660K/8388068K available (16384K kernel code, 5784K rwdata, 13996K rodata, 4068K init, 7304K bss, 616504K reserved, 0K cma-reserved)
Oct 08 18:48:51 localhost kernel: devtmpfs: initialized
Oct 08 18:48:51 localhost kernel: x86/mm: Memory block size: 128MB
Oct 08 18:48:51 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct 08 18:48:51 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Oct 08 18:48:51 localhost kernel: pinctrl core: initialized pinctrl subsystem
Oct 08 18:48:51 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct 08 18:48:51 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Oct 08 18:48:51 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Oct 08 18:48:51 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Oct 08 18:48:51 localhost kernel: audit: initializing netlink subsys (disabled)
Oct 08 18:48:51 localhost kernel: audit: type=2000 audit(1759949329.395:1): state=initialized audit_enabled=0 res=1
Oct 08 18:48:51 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Oct 08 18:48:51 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct 08 18:48:51 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Oct 08 18:48:51 localhost kernel: cpuidle: using governor menu
Oct 08 18:48:51 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct 08 18:48:51 localhost kernel: PCI: Using configuration type 1 for base access
Oct 08 18:48:51 localhost kernel: PCI: Using configuration type 1 for extended access
Oct 08 18:48:51 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct 08 18:48:51 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Oct 08 18:48:51 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Oct 08 18:48:51 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Oct 08 18:48:51 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Oct 08 18:48:51 localhost kernel: Demotion targets for Node 0: null
Oct 08 18:48:51 localhost kernel: cryptd: max_cpu_qlen set to 1000
Oct 08 18:48:51 localhost kernel: ACPI: Added _OSI(Module Device)
Oct 08 18:48:51 localhost kernel: ACPI: Added _OSI(Processor Device)
Oct 08 18:48:51 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Oct 08 18:48:51 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct 08 18:48:51 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct 08 18:48:51 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Oct 08 18:48:51 localhost kernel: ACPI: Interpreter enabled
Oct 08 18:48:51 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Oct 08 18:48:51 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Oct 08 18:48:51 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct 08 18:48:51 localhost kernel: PCI: Using E820 reservations for host bridge windows
Oct 08 18:48:51 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Oct 08 18:48:51 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Oct 08 18:48:51 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [3] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [4] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [5] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [6] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [7] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [8] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [9] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [10] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [11] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [12] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [13] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [14] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [15] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [16] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [17] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [18] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [19] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [20] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [21] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [22] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [23] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [24] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [25] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [26] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [27] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [28] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [29] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [30] registered
Oct 08 18:48:51 localhost kernel: acpiphp: Slot [31] registered
Oct 08 18:48:51 localhost kernel: PCI host bridge to bus 0000:00
Oct 08 18:48:51 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Oct 08 18:48:51 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Oct 08 18:48:51 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct 08 18:48:51 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct 08 18:48:51 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Oct 08 18:48:51 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Oct 08 18:48:51 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Oct 08 18:48:51 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Oct 08 18:48:51 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Oct 08 18:48:51 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc180-0xc18f]
Oct 08 18:48:51 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Oct 08 18:48:51 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Oct 08 18:48:51 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Oct 08 18:48:51 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Oct 08 18:48:51 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Oct 08 18:48:51 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc140-0xc15f]
Oct 08 18:48:51 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Oct 08 18:48:51 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Oct 08 18:48:51 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Oct 08 18:48:51 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Oct 08 18:48:51 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Oct 08 18:48:51 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Oct 08 18:48:51 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Oct 08 18:48:51 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Oct 08 18:48:51 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct 08 18:48:51 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct 08 18:48:51 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Oct 08 18:48:51 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Oct 08 18:48:51 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Oct 08 18:48:51 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfea80000-0xfeafffff pref]
Oct 08 18:48:51 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Oct 08 18:48:51 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Oct 08 18:48:51 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Oct 08 18:48:51 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Oct 08 18:48:51 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Oct 08 18:48:51 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Oct 08 18:48:51 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Oct 08 18:48:51 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Oct 08 18:48:51 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc160-0xc17f]
Oct 08 18:48:51 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Oct 08 18:48:51 localhost kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct 08 18:48:51 localhost kernel: pci 0000:00:07.0: BAR 0 [io  0xc100-0xc13f]
Oct 08 18:48:51 localhost kernel: pci 0000:00:07.0: BAR 1 [mem 0xfeb93000-0xfeb93fff]
Oct 08 18:48:51 localhost kernel: pci 0000:00:07.0: BAR 4 [mem 0xfe814000-0xfe817fff 64bit pref]
Oct 08 18:48:51 localhost kernel: pci 0000:00:07.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Oct 08 18:48:51 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Oct 08 18:48:51 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Oct 08 18:48:51 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Oct 08 18:48:51 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Oct 08 18:48:51 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Oct 08 18:48:51 localhost kernel: iommu: Default domain type: Translated
Oct 08 18:48:51 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Oct 08 18:48:51 localhost kernel: SCSI subsystem initialized
Oct 08 18:48:51 localhost kernel: ACPI: bus type USB registered
Oct 08 18:48:51 localhost kernel: usbcore: registered new interface driver usbfs
Oct 08 18:48:51 localhost kernel: usbcore: registered new interface driver hub
Oct 08 18:48:51 localhost kernel: usbcore: registered new device driver usb
Oct 08 18:48:51 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Oct 08 18:48:51 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Oct 08 18:48:51 localhost kernel: PTP clock support registered
Oct 08 18:48:51 localhost kernel: EDAC MC: Ver: 3.0.0
Oct 08 18:48:51 localhost kernel: NetLabel: Initializing
Oct 08 18:48:51 localhost kernel: NetLabel:  domain hash size = 128
Oct 08 18:48:51 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Oct 08 18:48:51 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Oct 08 18:48:51 localhost kernel: PCI: Using ACPI for IRQ routing
Oct 08 18:48:51 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Oct 08 18:48:51 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Oct 08 18:48:51 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Oct 08 18:48:51 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Oct 08 18:48:51 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Oct 08 18:48:51 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Oct 08 18:48:51 localhost kernel: vgaarb: loaded
Oct 08 18:48:51 localhost kernel: clocksource: Switched to clocksource kvm-clock
Oct 08 18:48:51 localhost kernel: VFS: Disk quotas dquot_6.6.0
Oct 08 18:48:51 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Oct 08 18:48:51 localhost kernel: pnp: PnP ACPI init
Oct 08 18:48:51 localhost kernel: pnp 00:03: [dma 2]
Oct 08 18:48:51 localhost kernel: pnp: PnP ACPI: found 5 devices
Oct 08 18:48:51 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Oct 08 18:48:51 localhost kernel: NET: Registered PF_INET protocol family
Oct 08 18:48:51 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct 08 18:48:51 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Oct 08 18:48:51 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Oct 08 18:48:51 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Oct 08 18:48:51 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Oct 08 18:48:51 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Oct 08 18:48:51 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Oct 08 18:48:51 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct 08 18:48:51 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct 08 18:48:51 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Oct 08 18:48:51 localhost kernel: NET: Registered PF_XDP protocol family
Oct 08 18:48:51 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Oct 08 18:48:51 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Oct 08 18:48:51 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Oct 08 18:48:51 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Oct 08 18:48:51 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Oct 08 18:48:51 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Oct 08 18:48:51 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Oct 08 18:48:51 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Oct 08 18:48:51 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 72283 usecs
Oct 08 18:48:51 localhost kernel: PCI: CLS 0 bytes, default 64
Oct 08 18:48:51 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Oct 08 18:48:51 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Oct 08 18:48:51 localhost kernel: Trying to unpack rootfs image as initramfs...
Oct 08 18:48:51 localhost kernel: ACPI: bus type thunderbolt registered
Oct 08 18:48:51 localhost kernel: Initialise system trusted keyrings
Oct 08 18:48:51 localhost kernel: Key type blacklist registered
Oct 08 18:48:51 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Oct 08 18:48:51 localhost kernel: zbud: loaded
Oct 08 18:48:51 localhost kernel: integrity: Platform Keyring initialized
Oct 08 18:48:51 localhost kernel: integrity: Machine keyring initialized
Oct 08 18:48:51 localhost kernel: Freeing initrd memory: 86104K
Oct 08 18:48:51 localhost kernel: NET: Registered PF_ALG protocol family
Oct 08 18:48:51 localhost kernel: xor: automatically using best checksumming function   avx       
Oct 08 18:48:51 localhost kernel: Key type asymmetric registered
Oct 08 18:48:51 localhost kernel: Asymmetric key parser 'x509' registered
Oct 08 18:48:51 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Oct 08 18:48:51 localhost kernel: io scheduler mq-deadline registered
Oct 08 18:48:51 localhost kernel: io scheduler kyber registered
Oct 08 18:48:51 localhost kernel: io scheduler bfq registered
Oct 08 18:48:51 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Oct 08 18:48:51 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Oct 08 18:48:51 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Oct 08 18:48:51 localhost kernel: ACPI: button: Power Button [PWRF]
Oct 08 18:48:51 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Oct 08 18:48:51 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Oct 08 18:48:51 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Oct 08 18:48:51 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Oct 08 18:48:51 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Oct 08 18:48:51 localhost kernel: Non-volatile memory driver v1.3
Oct 08 18:48:51 localhost kernel: rdac: device handler registered
Oct 08 18:48:51 localhost kernel: hp_sw: device handler registered
Oct 08 18:48:51 localhost kernel: emc: device handler registered
Oct 08 18:48:51 localhost kernel: alua: device handler registered
Oct 08 18:48:51 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Oct 08 18:48:51 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Oct 08 18:48:51 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Oct 08 18:48:51 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c140
Oct 08 18:48:51 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Oct 08 18:48:51 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Oct 08 18:48:51 localhost kernel: usb usb1: Product: UHCI Host Controller
Oct 08 18:48:51 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-620.el9.x86_64 uhci_hcd
Oct 08 18:48:51 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Oct 08 18:48:51 localhost kernel: hub 1-0:1.0: USB hub found
Oct 08 18:48:51 localhost kernel: hub 1-0:1.0: 2 ports detected
Oct 08 18:48:51 localhost kernel: usbcore: registered new interface driver usbserial_generic
Oct 08 18:48:51 localhost kernel: usbserial: USB Serial support registered for generic
Oct 08 18:48:51 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Oct 08 18:48:51 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Oct 08 18:48:51 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Oct 08 18:48:51 localhost kernel: mousedev: PS/2 mouse device common for all mice
Oct 08 18:48:51 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Oct 08 18:48:51 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Oct 08 18:48:51 localhost kernel: rtc_cmos 00:04: registered as rtc0
Oct 08 18:48:51 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-10-08T18:48:50 UTC (1759949330)
Oct 08 18:48:51 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Oct 08 18:48:51 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Oct 08 18:48:51 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Oct 08 18:48:51 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Oct 08 18:48:51 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Oct 08 18:48:51 localhost kernel: usbcore: registered new interface driver usbhid
Oct 08 18:48:51 localhost kernel: usbhid: USB HID core driver
Oct 08 18:48:51 localhost kernel: drop_monitor: Initializing network drop monitor service
Oct 08 18:48:51 localhost kernel: Initializing XFRM netlink socket
Oct 08 18:48:51 localhost kernel: NET: Registered PF_INET6 protocol family
Oct 08 18:48:51 localhost kernel: Segment Routing with IPv6
Oct 08 18:48:51 localhost kernel: NET: Registered PF_PACKET protocol family
Oct 08 18:48:51 localhost kernel: mpls_gso: MPLS GSO support
Oct 08 18:48:51 localhost kernel: IPI shorthand broadcast: enabled
Oct 08 18:48:51 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Oct 08 18:48:51 localhost kernel: AES CTR mode by8 optimization enabled
Oct 08 18:48:51 localhost kernel: sched_clock: Marking stable (1189003860, 145107150)->(1442805359, -108694349)
Oct 08 18:48:51 localhost kernel: registered taskstats version 1
Oct 08 18:48:51 localhost kernel: Loading compiled-in X.509 certificates
Oct 08 18:48:51 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct 08 18:48:51 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Oct 08 18:48:51 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Oct 08 18:48:51 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Oct 08 18:48:51 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Oct 08 18:48:51 localhost kernel: Demotion targets for Node 0: null
Oct 08 18:48:51 localhost kernel: page_owner is disabled
Oct 08 18:48:51 localhost kernel: Key type .fscrypt registered
Oct 08 18:48:51 localhost kernel: Key type fscrypt-provisioning registered
Oct 08 18:48:51 localhost kernel: Key type big_key registered
Oct 08 18:48:51 localhost kernel: Key type encrypted registered
Oct 08 18:48:51 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Oct 08 18:48:51 localhost kernel: Loading compiled-in module X.509 certificates
Oct 08 18:48:51 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct 08 18:48:51 localhost kernel: ima: Allocated hash algorithm: sha256
Oct 08 18:48:51 localhost kernel: ima: No architecture policies found
Oct 08 18:48:51 localhost kernel: evm: Initialising EVM extended attributes:
Oct 08 18:48:51 localhost kernel: evm: security.selinux
Oct 08 18:48:51 localhost kernel: evm: security.SMACK64 (disabled)
Oct 08 18:48:51 localhost kernel: evm: security.SMACK64EXEC (disabled)
Oct 08 18:48:51 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Oct 08 18:48:51 localhost kernel: evm: security.SMACK64MMAP (disabled)
Oct 08 18:48:51 localhost kernel: evm: security.apparmor (disabled)
Oct 08 18:48:51 localhost kernel: evm: security.ima
Oct 08 18:48:51 localhost kernel: evm: security.capability
Oct 08 18:48:51 localhost kernel: evm: HMAC attrs: 0x1
Oct 08 18:48:51 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Oct 08 18:48:51 localhost kernel: Running certificate verification RSA selftest
Oct 08 18:48:51 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Oct 08 18:48:51 localhost kernel: Running certificate verification ECDSA selftest
Oct 08 18:48:51 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Oct 08 18:48:51 localhost kernel: clk: Disabling unused clocks
Oct 08 18:48:51 localhost kernel: Freeing unused decrypted memory: 2028K
Oct 08 18:48:51 localhost kernel: Freeing unused kernel image (initmem) memory: 4068K
Oct 08 18:48:51 localhost kernel: Write protecting the kernel read-only data: 30720k
Oct 08 18:48:51 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Oct 08 18:48:51 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Oct 08 18:48:51 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Oct 08 18:48:51 localhost kernel: usb 1-1: Manufacturer: QEMU
Oct 08 18:48:51 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Oct 08 18:48:51 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 340K
Oct 08 18:48:51 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Oct 08 18:48:51 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Oct 08 18:48:51 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Oct 08 18:48:51 localhost kernel: Run /init as init process
Oct 08 18:48:51 localhost kernel:   with arguments:
Oct 08 18:48:51 localhost kernel:     /init
Oct 08 18:48:51 localhost kernel:   with environment:
Oct 08 18:48:51 localhost kernel:     HOME=/
Oct 08 18:48:51 localhost kernel:     TERM=linux
Oct 08 18:48:51 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64
Oct 08 18:48:51 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 08 18:48:51 localhost systemd[1]: Detected virtualization kvm.
Oct 08 18:48:51 localhost systemd[1]: Detected architecture x86-64.
Oct 08 18:48:51 localhost systemd[1]: Running in initrd.
Oct 08 18:48:51 localhost systemd[1]: No hostname configured, using default hostname.
Oct 08 18:48:51 localhost systemd[1]: Hostname set to <localhost>.
Oct 08 18:48:51 localhost systemd[1]: Initializing machine ID from VM UUID.
Oct 08 18:48:51 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Oct 08 18:48:51 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Oct 08 18:48:51 localhost systemd[1]: Reached target Local Encrypted Volumes.
Oct 08 18:48:51 localhost systemd[1]: Reached target Initrd /usr File System.
Oct 08 18:48:51 localhost systemd[1]: Reached target Local File Systems.
Oct 08 18:48:51 localhost systemd[1]: Reached target Path Units.
Oct 08 18:48:51 localhost systemd[1]: Reached target Slice Units.
Oct 08 18:48:51 localhost systemd[1]: Reached target Swaps.
Oct 08 18:48:51 localhost systemd[1]: Reached target Timer Units.
Oct 08 18:48:51 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct 08 18:48:51 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Oct 08 18:48:51 localhost systemd[1]: Listening on Journal Socket.
Oct 08 18:48:51 localhost systemd[1]: Listening on udev Control Socket.
Oct 08 18:48:51 localhost systemd[1]: Listening on udev Kernel Socket.
Oct 08 18:48:51 localhost systemd[1]: Reached target Socket Units.
Oct 08 18:48:51 localhost systemd[1]: Starting Create List of Static Device Nodes...
Oct 08 18:48:51 localhost systemd[1]: Starting Journal Service...
Oct 08 18:48:51 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct 08 18:48:51 localhost systemd[1]: Starting Apply Kernel Variables...
Oct 08 18:48:51 localhost systemd[1]: Starting Create System Users...
Oct 08 18:48:51 localhost systemd[1]: Starting Setup Virtual Console...
Oct 08 18:48:51 localhost systemd[1]: Finished Create List of Static Device Nodes.
Oct 08 18:48:51 localhost systemd[1]: Finished Apply Kernel Variables.
Oct 08 18:48:51 localhost systemd[1]: Finished Create System Users.
Oct 08 18:48:51 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Oct 08 18:48:51 localhost systemd-journald[305]: Journal started
Oct 08 18:48:51 localhost systemd-journald[305]: Runtime Journal (/run/log/journal/9ff32318d7e04b37bb6eea4cfd795672) is 8.0M, max 153.5M, 145.5M free.
Oct 08 18:48:51 localhost systemd-sysusers[308]: Creating group 'users' with GID 100.
Oct 08 18:48:51 localhost systemd-sysusers[308]: Creating group 'dbus' with GID 81.
Oct 08 18:48:51 localhost systemd-sysusers[308]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Oct 08 18:48:51 localhost systemd[1]: Started Journal Service.
Oct 08 18:48:51 localhost systemd[1]: Starting Create Volatile Files and Directories...
Oct 08 18:48:51 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Oct 08 18:48:51 localhost systemd[1]: Finished Create Volatile Files and Directories.
Oct 08 18:48:51 localhost systemd[1]: Finished Setup Virtual Console.
Oct 08 18:48:51 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Oct 08 18:48:51 localhost systemd[1]: Starting dracut cmdline hook...
Oct 08 18:48:51 localhost dracut-cmdline[327]: dracut-9 dracut-057-102.git20250818.el9
Oct 08 18:48:51 localhost dracut-cmdline[327]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 08 18:48:51 localhost systemd[1]: Finished dracut cmdline hook.
Oct 08 18:48:51 localhost systemd[1]: Starting dracut pre-udev hook...
Oct 08 18:48:51 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Oct 08 18:48:51 localhost kernel: device-mapper: uevent: version 1.0.3
Oct 08 18:48:51 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Oct 08 18:48:51 localhost kernel: RPC: Registered named UNIX socket transport module.
Oct 08 18:48:51 localhost kernel: RPC: Registered udp transport module.
Oct 08 18:48:51 localhost kernel: RPC: Registered tcp transport module.
Oct 08 18:48:51 localhost kernel: RPC: Registered tcp-with-tls transport module.
Oct 08 18:48:51 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Oct 08 18:48:51 localhost rpc.statd[443]: Version 2.5.4 starting
Oct 08 18:48:51 localhost rpc.statd[443]: Initializing NSM state
Oct 08 18:48:52 localhost rpc.idmapd[448]: Setting log level to 0
Oct 08 18:48:52 localhost systemd[1]: Finished dracut pre-udev hook.
Oct 08 18:48:52 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 08 18:48:52 localhost systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Oct 08 18:48:52 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 08 18:48:52 localhost systemd[1]: Starting dracut pre-trigger hook...
Oct 08 18:48:52 localhost systemd[1]: Finished dracut pre-trigger hook.
Oct 08 18:48:52 localhost systemd[1]: Starting Coldplug All udev Devices...
Oct 08 18:48:52 localhost systemd[1]: Created slice Slice /system/modprobe.
Oct 08 18:48:52 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 08 18:48:52 localhost systemd[1]: Finished Coldplug All udev Devices.
Oct 08 18:48:52 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 08 18:48:52 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 08 18:48:52 localhost systemd[1]: Mounting Kernel Configuration File System...
Oct 08 18:48:52 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct 08 18:48:52 localhost systemd[1]: Reached target Network.
Oct 08 18:48:52 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct 08 18:48:52 localhost systemd[1]: Starting dracut initqueue hook...
Oct 08 18:48:52 localhost systemd[1]: Mounted Kernel Configuration File System.
Oct 08 18:48:52 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Oct 08 18:48:52 localhost systemd[1]: Reached target System Initialization.
Oct 08 18:48:52 localhost systemd[1]: Reached target Basic System.
Oct 08 18:48:52 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Oct 08 18:48:52 localhost kernel:  vda: vda1
Oct 08 18:48:52 localhost kernel: libata version 3.00 loaded.
Oct 08 18:48:52 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Oct 08 18:48:52 localhost kernel: scsi host0: ata_piix
Oct 08 18:48:52 localhost kernel: scsi host1: ata_piix
Oct 08 18:48:52 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc180 irq 14 lpm-pol 0
Oct 08 18:48:52 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc188 irq 15 lpm-pol 0
Oct 08 18:48:52 localhost systemd[1]: Found device /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct 08 18:48:52 localhost systemd[1]: Reached target Initrd Root Device.
Oct 08 18:48:52 localhost kernel: ata1: found unknown device (class 0)
Oct 08 18:48:52 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Oct 08 18:48:52 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Oct 08 18:48:52 localhost systemd-udevd[473]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 18:48:52 localhost systemd-udevd[481]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 18:48:52 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Oct 08 18:48:52 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Oct 08 18:48:52 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Oct 08 18:48:52 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Oct 08 18:48:52 localhost systemd[1]: Finished dracut initqueue hook.
Oct 08 18:48:52 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Oct 08 18:48:52 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Oct 08 18:48:52 localhost systemd[1]: Reached target Remote File Systems.
Oct 08 18:48:52 localhost systemd[1]: Starting dracut pre-mount hook...
Oct 08 18:48:52 localhost systemd[1]: Finished dracut pre-mount hook.
Oct 08 18:48:52 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458...
Oct 08 18:48:52 localhost systemd-fsck[554]: /usr/sbin/fsck.xfs: XFS file system.
Oct 08 18:48:52 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct 08 18:48:52 localhost systemd[1]: Mounting /sysroot...
Oct 08 18:48:53 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Oct 08 18:48:53 localhost kernel: XFS (vda1): Mounting V5 Filesystem 1631a6ad-43b8-436d-ae76-16fa14b94458
Oct 08 18:48:53 localhost kernel: XFS (vda1): Ending clean mount
Oct 08 18:48:53 localhost systemd[1]: Mounted /sysroot.
Oct 08 18:48:53 localhost systemd[1]: Reached target Initrd Root File System.
Oct 08 18:48:53 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Oct 08 18:48:53 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Oct 08 18:48:53 localhost systemd[1]: Reached target Initrd File Systems.
Oct 08 18:48:53 localhost systemd[1]: Reached target Initrd Default Target.
Oct 08 18:48:53 localhost systemd[1]: Starting dracut mount hook...
Oct 08 18:48:53 localhost systemd[1]: Finished dracut mount hook.
Oct 08 18:48:53 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Oct 08 18:48:53 localhost rpc.idmapd[448]: exiting on signal 15
Oct 08 18:48:53 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Oct 08 18:48:53 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Oct 08 18:48:53 localhost systemd[1]: Stopped target Network.
Oct 08 18:48:53 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Oct 08 18:48:53 localhost systemd[1]: Stopped target Timer Units.
Oct 08 18:48:53 localhost systemd[1]: dbus.socket: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Oct 08 18:48:53 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Oct 08 18:48:53 localhost systemd[1]: Stopped target Initrd Default Target.
Oct 08 18:48:53 localhost systemd[1]: Stopped target Basic System.
Oct 08 18:48:53 localhost systemd[1]: Stopped target Initrd Root Device.
Oct 08 18:48:53 localhost systemd[1]: Stopped target Initrd /usr File System.
Oct 08 18:48:53 localhost systemd[1]: Stopped target Path Units.
Oct 08 18:48:53 localhost systemd[1]: Stopped target Remote File Systems.
Oct 08 18:48:53 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Oct 08 18:48:53 localhost systemd[1]: Stopped target Slice Units.
Oct 08 18:48:53 localhost systemd[1]: Stopped target Socket Units.
Oct 08 18:48:53 localhost systemd[1]: Stopped target System Initialization.
Oct 08 18:48:53 localhost systemd[1]: Stopped target Local File Systems.
Oct 08 18:48:53 localhost systemd[1]: Stopped target Swaps.
Oct 08 18:48:53 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Stopped dracut mount hook.
Oct 08 18:48:53 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Stopped dracut pre-mount hook.
Oct 08 18:48:53 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Oct 08 18:48:53 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Oct 08 18:48:53 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Stopped dracut initqueue hook.
Oct 08 18:48:53 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Stopped Apply Kernel Variables.
Oct 08 18:48:53 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Oct 08 18:48:53 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Stopped Coldplug All udev Devices.
Oct 08 18:48:53 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Stopped dracut pre-trigger hook.
Oct 08 18:48:53 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct 08 18:48:53 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Stopped Setup Virtual Console.
Oct 08 18:48:53 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Oct 08 18:48:53 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct 08 18:48:53 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Closed udev Control Socket.
Oct 08 18:48:53 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Closed udev Kernel Socket.
Oct 08 18:48:53 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Stopped dracut pre-udev hook.
Oct 08 18:48:53 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Stopped dracut cmdline hook.
Oct 08 18:48:53 localhost systemd[1]: Starting Cleanup udev Database...
Oct 08 18:48:53 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Oct 08 18:48:53 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Oct 08 18:48:53 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Stopped Create System Users.
Oct 08 18:48:53 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct 08 18:48:53 localhost systemd[1]: Finished Cleanup udev Database.
Oct 08 18:48:53 localhost systemd[1]: Reached target Switch Root.
Oct 08 18:48:53 localhost systemd[1]: Starting Switch Root...
Oct 08 18:48:53 localhost systemd[1]: Switching root.
Oct 08 18:48:53 localhost systemd-journald[305]: Journal stopped
Oct 08 18:48:55 compute-0 systemd-journald[305]: Received SIGTERM from PID 1 (systemd).
Oct 08 18:48:55 compute-0 kernel: audit: type=1404 audit(1759949334.256:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Oct 08 18:48:55 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 08 18:48:55 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 08 18:48:55 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 08 18:48:55 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 08 18:48:55 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 08 18:48:55 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 08 18:48:55 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 08 18:48:55 compute-0 kernel: audit: type=1403 audit(1759949334.413:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct 08 18:48:55 compute-0 systemd[1]: Successfully loaded SELinux policy in 161.369ms.
Oct 08 18:48:55 compute-0 systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 37.390ms.
Oct 08 18:48:55 compute-0 systemd[1]: systemd 252-57.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 08 18:48:55 compute-0 systemd[1]: Detected virtualization kvm.
Oct 08 18:48:55 compute-0 systemd[1]: Detected architecture x86-64.
Oct 08 18:48:55 compute-0 systemd[1]: Hostname set to <compute-0>.
Oct 08 18:48:55 compute-0 systemd-rc-local-generator[639]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:48:55 compute-0 systemd-sysv-generator[642]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:48:55 compute-0 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Oct 08 18:48:55 compute-0 systemd[1]: Stopped Switch Root.
Oct 08 18:48:55 compute-0 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct 08 18:48:55 compute-0 systemd[1]: Created slice Slice /system/getty.
Oct 08 18:48:55 compute-0 systemd[1]: Created slice Slice /system/serial-getty.
Oct 08 18:48:55 compute-0 systemd[1]: Created slice Slice /system/sshd-keygen.
Oct 08 18:48:55 compute-0 systemd[1]: Created slice User and Session Slice.
Oct 08 18:48:55 compute-0 systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Oct 08 18:48:55 compute-0 systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Oct 08 18:48:55 compute-0 systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Oct 08 18:48:55 compute-0 systemd[1]: Reached target Local Encrypted Volumes.
Oct 08 18:48:55 compute-0 systemd[1]: Stopped target Switch Root.
Oct 08 18:48:55 compute-0 systemd[1]: Stopped target Initrd File Systems.
Oct 08 18:48:55 compute-0 systemd[1]: Stopped target Initrd Root File System.
Oct 08 18:48:55 compute-0 systemd[1]: Reached target Local Integrity Protected Volumes.
Oct 08 18:48:55 compute-0 systemd[1]: Reached target Path Units.
Oct 08 18:48:55 compute-0 systemd[1]: Reached target rpc_pipefs.target.
Oct 08 18:48:55 compute-0 systemd[1]: Reached target Slice Units.
Oct 08 18:48:55 compute-0 systemd[1]: Reached target Local Verity Protected Volumes.
Oct 08 18:48:55 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Oct 08 18:48:55 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Oct 08 18:48:55 compute-0 systemd[1]: Listening on RPCbind Server Activation Socket.
Oct 08 18:48:55 compute-0 systemd[1]: Reached target RPC Port Mapper.
Oct 08 18:48:55 compute-0 systemd[1]: Listening on Process Core Dump Socket.
Oct 08 18:48:55 compute-0 systemd[1]: Listening on initctl Compatibility Named Pipe.
Oct 08 18:48:55 compute-0 systemd[1]: Listening on udev Control Socket.
Oct 08 18:48:55 compute-0 systemd[1]: Listening on udev Kernel Socket.
Oct 08 18:48:55 compute-0 systemd[1]: Mounting Huge Pages File System...
Oct 08 18:48:55 compute-0 systemd[1]: Mounting /dev/hugepages1G...
Oct 08 18:48:55 compute-0 systemd[1]: Mounting /dev/hugepages2M...
Oct 08 18:48:55 compute-0 systemd[1]: Mounting POSIX Message Queue File System...
Oct 08 18:48:55 compute-0 systemd[1]: Mounting Kernel Debug File System...
Oct 08 18:48:55 compute-0 systemd[1]: Mounting Kernel Trace File System...
Oct 08 18:48:55 compute-0 systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct 08 18:48:55 compute-0 systemd[1]: Starting Create List of Static Device Nodes...
Oct 08 18:48:55 compute-0 systemd[1]: Load legacy module configuration was skipped because no trigger condition checks were met.
Oct 08 18:48:55 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Oct 08 18:48:55 compute-0 systemd[1]: Starting Load Kernel Module configfs...
Oct 08 18:48:55 compute-0 systemd[1]: Starting Load Kernel Module drm...
Oct 08 18:48:55 compute-0 systemd[1]: Starting Load Kernel Module efi_pstore...
Oct 08 18:48:55 compute-0 systemd[1]: Starting Load Kernel Module fuse...
Oct 08 18:48:55 compute-0 systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Oct 08 18:48:55 compute-0 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Oct 08 18:48:55 compute-0 systemd[1]: Stopped File System Check on Root Device.
Oct 08 18:48:55 compute-0 systemd[1]: Stopped Journal Service.
Oct 08 18:48:55 compute-0 kernel: fuse: init (API version 7.37)
Oct 08 18:48:55 compute-0 systemd[1]: Starting Journal Service...
Oct 08 18:48:55 compute-0 systemd[1]: Starting Load Kernel Modules...
Oct 08 18:48:55 compute-0 systemd[1]: Starting Generate network units from Kernel command line...
Oct 08 18:48:55 compute-0 systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 08 18:48:55 compute-0 systemd[1]: Starting Remount Root and Kernel File Systems...
Oct 08 18:48:55 compute-0 systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Oct 08 18:48:55 compute-0 systemd[1]: Starting Coldplug All udev Devices...
Oct 08 18:48:55 compute-0 systemd-journald[688]: Journal started
Oct 08 18:48:55 compute-0 systemd-journald[688]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct 08 18:48:54 compute-0 systemd[1]: Queued start job for default target Multi-User System.
Oct 08 18:48:55 compute-0 systemd[1]: systemd-journald.service: Deactivated successfully.
Oct 08 18:48:55 compute-0 systemd[1]: Started Journal Service.
Oct 08 18:48:55 compute-0 systemd[1]: Mounted Huge Pages File System.
Oct 08 18:48:55 compute-0 kernel: ACPI: bus type drm_connector registered
Oct 08 18:48:55 compute-0 systemd[1]: Mounted /dev/hugepages1G.
Oct 08 18:48:55 compute-0 systemd[1]: Mounted /dev/hugepages2M.
Oct 08 18:48:55 compute-0 systemd[1]: Mounted POSIX Message Queue File System.
Oct 08 18:48:55 compute-0 systemd[1]: Mounted Kernel Debug File System.
Oct 08 18:48:55 compute-0 systemd[1]: Mounted Kernel Trace File System.
Oct 08 18:48:55 compute-0 systemd[1]: Finished Create List of Static Device Nodes.
Oct 08 18:48:55 compute-0 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 08 18:48:55 compute-0 systemd[1]: Finished Load Kernel Module configfs.
Oct 08 18:48:55 compute-0 systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct 08 18:48:55 compute-0 systemd[1]: Finished Load Kernel Module drm.
Oct 08 18:48:55 compute-0 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Oct 08 18:48:55 compute-0 systemd[1]: Finished Load Kernel Module efi_pstore.
Oct 08 18:48:55 compute-0 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Oct 08 18:48:55 compute-0 systemd[1]: Finished Load Kernel Module fuse.
Oct 08 18:48:55 compute-0 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Oct 08 18:48:55 compute-0 systemd[1]: Finished Generate network units from Kernel command line.
Oct 08 18:48:55 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Oct 08 18:48:55 compute-0 systemd[1]: Mounting FUSE Control File System...
Oct 08 18:48:55 compute-0 systemd[1]: Mounted FUSE Control File System.
Oct 08 18:48:55 compute-0 kernel: Bridge firewalling registered
Oct 08 18:48:55 compute-0 systemd-modules-load[689]: Inserted module 'br_netfilter'
Oct 08 18:48:55 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Oct 08 18:48:55 compute-0 systemd-modules-load[689]: Inserted module 'nf_conntrack'
Oct 08 18:48:55 compute-0 systemd[1]: Finished Load Kernel Modules.
Oct 08 18:48:55 compute-0 systemd[1]: Starting Apply Kernel Variables...
Oct 08 18:48:55 compute-0 systemd[1]: Finished Apply Kernel Variables.
Oct 08 18:48:55 compute-0 systemd[1]: Finished Coldplug All udev Devices.
Oct 08 18:48:55 compute-0 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Oct 08 18:48:55 compute-0 systemd[1]: Finished Remount Root and Kernel File Systems.
Oct 08 18:48:55 compute-0 systemd[1]: Activating swap /swap...
Oct 08 18:48:55 compute-0 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct 08 18:48:55 compute-0 systemd[1]: Rebuild Hardware Database was skipped because of an unmet condition check (ConditionNeedsUpdate=/etc).
Oct 08 18:48:55 compute-0 systemd[1]: Starting Flush Journal to Persistent Storage...
Oct 08 18:48:55 compute-0 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Oct 08 18:48:55 compute-0 systemd[1]: Starting Load/Save OS Random Seed...
Oct 08 18:48:55 compute-0 systemd[1]: Create System Users was skipped because no trigger condition checks were met.
Oct 08 18:48:55 compute-0 systemd[1]: Starting Create Static Device Nodes in /dev...
Oct 08 18:48:55 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Oct 08 18:48:55 compute-0 systemd[1]: Activated swap /swap.
Oct 08 18:48:55 compute-0 systemd[1]: Reached target Swaps.
Oct 08 18:48:55 compute-0 systemd-journald[688]: Time spent on flushing to /var/log/journal/42833e1b511a402df82cb9cb2fc36491 is 15.633ms for 786 entries.
Oct 08 18:48:55 compute-0 systemd-journald[688]: System Journal (/var/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 4.0G, 3.9G free.
Oct 08 18:48:55 compute-0 systemd-journald[688]: Received client request to flush runtime journal.
Oct 08 18:48:55 compute-0 systemd[1]: Finished Load/Save OS Random Seed.
Oct 08 18:48:55 compute-0 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct 08 18:48:55 compute-0 systemd[1]: Finished Flush Journal to Persistent Storage.
Oct 08 18:48:55 compute-0 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct 08 18:48:55 compute-0 systemd[1]: Reached target Preparation for Local File Systems.
Oct 08 18:48:55 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct 08 18:48:55 compute-0 systemd[1]: Reached target Local File Systems.
Oct 08 18:48:55 compute-0 systemd[1]: Starting Import network configuration from initramfs...
Oct 08 18:48:55 compute-0 systemd[1]: Rebuild Dynamic Linker Cache was skipped because no trigger condition checks were met.
Oct 08 18:48:55 compute-0 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Oct 08 18:48:55 compute-0 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct 08 18:48:55 compute-0 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Oct 08 18:48:55 compute-0 systemd[1]: Starting Automatic Boot Loader Update...
Oct 08 18:48:55 compute-0 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Oct 08 18:48:55 compute-0 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 08 18:48:55 compute-0 bootctl[705]: Couldn't find EFI system partition, skipping.
Oct 08 18:48:55 compute-0 systemd[1]: Finished Automatic Boot Loader Update.
Oct 08 18:48:55 compute-0 systemd[1]: Finished Import network configuration from initramfs.
Oct 08 18:48:55 compute-0 systemd-udevd[706]: Using default interface naming scheme 'rhel-9.0'.
Oct 08 18:48:55 compute-0 systemd[1]: Starting Create Volatile Files and Directories...
Oct 08 18:48:55 compute-0 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 08 18:48:55 compute-0 systemd[1]: Starting Load Kernel Module configfs...
Oct 08 18:48:55 compute-0 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 08 18:48:55 compute-0 systemd[1]: Finished Load Kernel Module configfs.
Oct 08 18:48:55 compute-0 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Oct 08 18:48:55 compute-0 systemd-udevd[741]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 18:48:55 compute-0 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Oct 08 18:48:55 compute-0 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Oct 08 18:48:55 compute-0 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Oct 08 18:48:55 compute-0 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Oct 08 18:48:55 compute-0 systemd[1]: Finished Create Volatile Files and Directories.
Oct 08 18:48:55 compute-0 systemd[1]: Starting Security Auditing Service...
Oct 08 18:48:55 compute-0 systemd[1]: Starting RPC Bind...
Oct 08 18:48:55 compute-0 systemd[1]: Rebuild Journal Catalog was skipped because of an unmet condition check (ConditionNeedsUpdate=/var).
Oct 08 18:48:55 compute-0 systemd[1]: Update is Completed was skipped because no trigger condition checks were met.
Oct 08 18:48:55 compute-0 auditd[775]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Oct 08 18:48:55 compute-0 auditd[775]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Oct 08 18:48:55 compute-0 systemd-udevd[723]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 18:48:55 compute-0 systemd[1]: Started RPC Bind.
Oct 08 18:48:55 compute-0 augenrules[781]: /sbin/augenrules: No change
Oct 08 18:48:55 compute-0 augenrules[799]: No rules
Oct 08 18:48:55 compute-0 augenrules[799]: enabled 1
Oct 08 18:48:55 compute-0 augenrules[799]: failure 1
Oct 08 18:48:55 compute-0 augenrules[799]: pid 775
Oct 08 18:48:55 compute-0 augenrules[799]: rate_limit 0
Oct 08 18:48:55 compute-0 augenrules[799]: backlog_limit 8192
Oct 08 18:48:55 compute-0 augenrules[799]: lost 0
Oct 08 18:48:55 compute-0 augenrules[799]: backlog 3
Oct 08 18:48:55 compute-0 augenrules[799]: backlog_wait_time 60000
Oct 08 18:48:55 compute-0 augenrules[799]: backlog_wait_time_actual 0
Oct 08 18:48:55 compute-0 augenrules[799]: enabled 1
Oct 08 18:48:55 compute-0 augenrules[799]: failure 1
Oct 08 18:48:55 compute-0 augenrules[799]: pid 775
Oct 08 18:48:55 compute-0 augenrules[799]: rate_limit 0
Oct 08 18:48:55 compute-0 augenrules[799]: backlog_limit 8192
Oct 08 18:48:55 compute-0 augenrules[799]: lost 0
Oct 08 18:48:55 compute-0 augenrules[799]: backlog 0
Oct 08 18:48:55 compute-0 augenrules[799]: backlog_wait_time 60000
Oct 08 18:48:55 compute-0 augenrules[799]: backlog_wait_time_actual 0
Oct 08 18:48:55 compute-0 augenrules[799]: enabled 1
Oct 08 18:48:55 compute-0 augenrules[799]: failure 1
Oct 08 18:48:55 compute-0 augenrules[799]: pid 775
Oct 08 18:48:55 compute-0 augenrules[799]: rate_limit 0
Oct 08 18:48:55 compute-0 augenrules[799]: backlog_limit 8192
Oct 08 18:48:55 compute-0 augenrules[799]: lost 0
Oct 08 18:48:55 compute-0 augenrules[799]: backlog 0
Oct 08 18:48:55 compute-0 augenrules[799]: backlog_wait_time 60000
Oct 08 18:48:55 compute-0 augenrules[799]: backlog_wait_time_actual 0
Oct 08 18:48:55 compute-0 systemd[1]: Started Security Auditing Service.
Oct 08 18:48:55 compute-0 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Oct 08 18:48:55 compute-0 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Oct 08 18:48:55 compute-0 kernel: Console: switching to colour dummy device 80x25
Oct 08 18:48:55 compute-0 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Oct 08 18:48:55 compute-0 kernel: [drm] features: -context_init
Oct 08 18:48:55 compute-0 kernel: [drm] number of scanouts: 1
Oct 08 18:48:55 compute-0 kernel: [drm] number of cap sets: 0
Oct 08 18:48:55 compute-0 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Oct 08 18:48:55 compute-0 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Oct 08 18:48:55 compute-0 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Oct 08 18:48:55 compute-0 kernel: kvm_amd: TSC scaling supported
Oct 08 18:48:55 compute-0 kernel: kvm_amd: Nested Virtualization enabled
Oct 08 18:48:55 compute-0 kernel: kvm_amd: Nested Paging enabled
Oct 08 18:48:55 compute-0 kernel: kvm_amd: LBR virtualization supported
Oct 08 18:48:55 compute-0 kernel: Console: switching to colour frame buffer device 128x48
Oct 08 18:48:55 compute-0 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Oct 08 18:48:55 compute-0 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Oct 08 18:48:56 compute-0 systemd[1]: Reached target System Initialization.
Oct 08 18:48:56 compute-0 systemd[1]: Started dnf makecache --timer.
Oct 08 18:48:56 compute-0 systemd[1]: Started Daily rotation of log files.
Oct 08 18:48:56 compute-0 systemd[1]: Started Run system activity accounting tool every 10 minutes.
Oct 08 18:48:56 compute-0 systemd[1]: Started Generate summary of yesterday's process accounting.
Oct 08 18:48:56 compute-0 systemd[1]: Started Daily Cleanup of Temporary Directories.
Oct 08 18:48:56 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Oct 08 18:48:56 compute-0 systemd[1]: Reached target Timer Units.
Oct 08 18:48:56 compute-0 systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct 08 18:48:56 compute-0 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Oct 08 18:48:56 compute-0 systemd[1]: Reached target Socket Units.
Oct 08 18:48:56 compute-0 systemd[1]: Starting D-Bus System Message Bus...
Oct 08 18:48:56 compute-0 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 08 18:48:56 compute-0 systemd[1]: Started D-Bus System Message Bus.
Oct 08 18:48:56 compute-0 systemd[1]: Reached target Basic System.
Oct 08 18:48:56 compute-0 dbus-broker-lau[835]: Ready
Oct 08 18:48:56 compute-0 systemd[1]: Starting NTP client/server...
Oct 08 18:48:56 compute-0 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Oct 08 18:48:56 compute-0 systemd[1]: Starting Restore /run/initramfs on shutdown...
Oct 08 18:48:56 compute-0 systemd[1]: Started irqbalance daemon.
Oct 08 18:48:56 compute-0 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Oct 08 18:48:56 compute-0 systemd[1]: Starting Create netns directory...
Oct 08 18:48:56 compute-0 systemd[1]: Starting Netfilter Tables...
Oct 08 18:48:56 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 08 18:48:56 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 08 18:48:56 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 08 18:48:56 compute-0 systemd[1]: Reached target sshd-keygen.target.
Oct 08 18:48:56 compute-0 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Oct 08 18:48:56 compute-0 systemd[1]: Reached target User and Group Name Lookups.
Oct 08 18:48:56 compute-0 systemd[1]: Starting Resets System Activity Logs...
Oct 08 18:48:56 compute-0 systemd[1]: Starting User Login Management...
Oct 08 18:48:56 compute-0 systemd[1]: Finished Restore /run/initramfs on shutdown.
Oct 08 18:48:56 compute-0 systemd[1]: Finished Resets System Activity Logs.
Oct 08 18:48:56 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 08 18:48:56 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 08 18:48:56 compute-0 systemd[1]: Finished Create netns directory.
Oct 08 18:48:56 compute-0 chronyd[850]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 08 18:48:56 compute-0 chronyd[850]: Frequency -28.709 +/- 0.114 ppm read from /var/lib/chrony/drift
Oct 08 18:48:56 compute-0 chronyd[850]: Loaded seccomp filter (level 2)
Oct 08 18:48:56 compute-0 systemd[1]: Started NTP client/server.
Oct 08 18:48:56 compute-0 systemd-logind[844]: New seat seat0.
Oct 08 18:48:56 compute-0 systemd-logind[844]: Watching system buttons on /dev/input/event0 (Power Button)
Oct 08 18:48:56 compute-0 systemd-logind[844]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct 08 18:48:56 compute-0 systemd[1]: Started User Login Management.
Oct 08 18:48:56 compute-0 systemd[1]: Finished Netfilter Tables.
Oct 08 18:48:57 compute-0 cloud-init[870]: Cloud-init v. 24.4-7.el9 running 'init-local' at Wed, 08 Oct 2025 18:48:57 +0000. Up 8.27 seconds.
Oct 08 18:48:57 compute-0 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Oct 08 18:48:57 compute-0 systemd[1]: Reached target Preparation for Network.
Oct 08 18:48:57 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Oct 08 18:48:57 compute-0 chown[872]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Oct 08 18:48:58 compute-0 ovs-ctl[877]: Starting ovsdb-server [  OK  ]
Oct 08 18:48:58 compute-0 ovs-vsctl[926]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Oct 08 18:48:58 compute-0 ovs-vsctl[936]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"47f81f7a-64d8-418a-a74c-b879bd6deb83\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Oct 08 18:48:58 compute-0 ovs-ctl[877]: Configuring Open vSwitch system IDs [  OK  ]
Oct 08 18:48:58 compute-0 ovs-vsctl[942]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Oct 08 18:48:58 compute-0 ovs-ctl[877]: Enabling remote OVSDB managers [  OK  ]
Oct 08 18:48:58 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Oct 08 18:48:58 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Oct 08 18:48:58 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Oct 08 18:48:58 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Oct 08 18:48:58 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Oct 08 18:48:58 compute-0 ovs-ctl[987]: Inserting openvswitch module [  OK  ]
Oct 08 18:48:58 compute-0 kernel: ovs-system: entered promiscuous mode
Oct 08 18:48:58 compute-0 kernel: Timeout policy base is empty
Oct 08 18:48:58 compute-0 systemd-udevd[1012]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 18:48:58 compute-0 kernel: vlan22: entered promiscuous mode
Oct 08 18:48:58 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Oct 08 18:48:58 compute-0 systemd-udevd[1013]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 18:48:58 compute-0 kernel: vlan20: entered promiscuous mode
Oct 08 18:48:58 compute-0 systemd-udevd[1022]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 18:48:58 compute-0 kernel: vlan21: entered promiscuous mode
Oct 08 18:48:58 compute-0 ovs-ctl[956]: Starting ovs-vswitchd [  OK  ]
Oct 08 18:48:58 compute-0 ovs-vsctl[1033]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Oct 08 18:48:58 compute-0 ovs-ctl[956]: Enabling remote OVSDB managers [  OK  ]
Oct 08 18:48:58 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Oct 08 18:48:58 compute-0 systemd[1]: Starting Open vSwitch...
Oct 08 18:48:58 compute-0 systemd[1]: Finished Open vSwitch.
Oct 08 18:48:58 compute-0 systemd[1]: Starting Network Manager...
Oct 08 18:48:58 compute-0 NetworkManager[1035]: <info>  [1759949338.8736] NetworkManager (version 1.54.1-1.el9) is starting... (boot:d538b0ba-483c-4d09-9bda-0412f54534f3)
Oct 08 18:48:58 compute-0 NetworkManager[1035]: <info>  [1759949338.8742] Read config: /etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf
Oct 08 18:48:58 compute-0 NetworkManager[1035]: <info>  [1759949338.8925] manager[0x55b9c6fb6040]: monitoring kernel firmware directory '/lib/firmware'.
Oct 08 18:48:58 compute-0 systemd[1]: Starting Hostname Service...
Oct 08 18:48:58 compute-0 systemd[1]: Started Hostname Service.
Oct 08 18:48:58 compute-0 NetworkManager[1035]: <info>  [1759949338.9706] hostname: hostname: using hostnamed
Oct 08 18:48:58 compute-0 NetworkManager[1035]: <info>  [1759949338.9706] hostname: static hostname changed from (none) to "compute-0"
Oct 08 18:48:58 compute-0 NetworkManager[1035]: <info>  [1759949338.9714] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 08 18:48:58 compute-0 NetworkManager[1035]: <info>  [1759949338.9876] manager[0x55b9c6fb6040]: rfkill: Wi-Fi hardware radio set enabled
Oct 08 18:48:58 compute-0 NetworkManager[1035]: <info>  [1759949338.9876] manager[0x55b9c6fb6040]: rfkill: WWAN hardware radio set enabled
Oct 08 18:48:58 compute-0 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Oct 08 18:48:58 compute-0 NetworkManager[1035]: <info>  [1759949338.9984] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0023] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0024] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0025] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0026] manager: Networking is enabled by state file
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0033] settings: Loaded settings plugin: keyfile (internal)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0064] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0157] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0179] dhcp: init: Using DHCP client 'internal'
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0181] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0191] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0202] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 08 18:48:59 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0220] device (lo): Activation: starting connection 'lo' (aed4deb5-95bc-489e-9824-933efef54b8c)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0227] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0229] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0247] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/3)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0249] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0262] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/4)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0264] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0278] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/5)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0280] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0293] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/6)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0297] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0310] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0312] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0317] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0319] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0324] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/9)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0325] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0330] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0332] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0337] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/11)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0340] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0345] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/12)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0347] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0352] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/13)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0354] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0374] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 08 18:48:59 compute-0 systemd[1]: Started Network Manager.
Oct 08 18:48:59 compute-0 systemd[1]: Reached target Network.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0391] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0403] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0405] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0406] device (eth0): carrier: link connected
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0407] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0408] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0409] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0410] device (eth1): carrier: link connected
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0415] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0421] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0425] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct 08 18:48:59 compute-0 kernel: vlan20: left promiscuous mode
Oct 08 18:48:59 compute-0 systemd[1]: Starting Network Manager Wait Online...
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <error> [1759949339.0441] platform-linux: sysctl: failed to set '/proc/sys/net/ipv6/conf/vlan20/disable_ipv6' to '1': (2) No such file or directory
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0454] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0461] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0464] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0469] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0470] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0472] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0473] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0474] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 systemd[1]: Starting GSSAPI Proxy Daemon...
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0488] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0496] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0498] policy: auto-activating connection 'ci-private-network' (f659475b-7c6f-5319-b371-519bc515c6f0)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0500] policy: auto-activating connection 'eth1-port' (19cf4192-6bcb-444b-ae5b-b65ed7eb80f5)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0501] policy: auto-activating connection 'vlan20-port' (250e055d-e2d6-47d8-a97e-ae3fcd2ad51e)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0502] policy: auto-activating connection 'br-ex-port' (8546a6a0-ff01-4e77-9a80-0ccb84b09e15)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0503] policy: auto-activating connection 'vlan21-port' (ae6c2a0f-6e02-482e-bde7-7c80cfee7790)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0503] policy: auto-activating connection 'br-ex-br' (e5fe6336-0393-4f39-89c9-10707afd900f)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0504] policy: auto-activating connection 'vlan22-port' (f99e4b34-6f01-46fc-9977-6ae5bcc3a7ce)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0507] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0513] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0515] device (eth1): Activation: starting connection 'ci-private-network' (f659475b-7c6f-5319-b371-519bc515c6f0)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0517] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (19cf4192-6bcb-444b-ae5b-b65ed7eb80f5)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0519] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (250e055d-e2d6-47d8-a97e-ae3fcd2ad51e)
Oct 08 18:48:59 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0532] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (8546a6a0-ff01-4e77-9a80-0ccb84b09e15)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0533] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (ae6c2a0f-6e02-482e-bde7-7c80cfee7790)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0538] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (e5fe6336-0393-4f39-89c9-10707afd900f)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0540] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (f99e4b34-6f01-46fc-9977-6ae5bcc3a7ce)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0543] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0545] manager: NetworkManager state is now CONNECTING
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0546] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0551] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0567] device (eth1): state change: prepare -> deactivating (reason 'new-activation', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0572] device (eth1): disconnecting for new activation request.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0572] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0574] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0576] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0578] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0581] device (br-ex)[Open vSwitch Port]: state change: prepare -> deactivating (reason 'new-activation', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0585] device (br-ex)[Open vSwitch Port]: disconnecting for new activation request.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0587] device (eth1)[Open vSwitch Port]: state change: prepare -> deactivating (reason 'new-activation', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0595] device (eth1)[Open vSwitch Port]: disconnecting for new activation request.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0596] device (vlan20)[Open vSwitch Port]: state change: prepare -> deactivating (reason 'new-activation', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0602] device (vlan20)[Open vSwitch Port]: disconnecting for new activation request.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0603] device (vlan21)[Open vSwitch Port]: state change: prepare -> deactivating (reason 'new-activation', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0609] device (vlan21)[Open vSwitch Port]: disconnecting for new activation request.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0609] device (vlan22)[Open vSwitch Port]: state change: disconnected -> deactivating (reason 'new-activation', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0616] device (vlan22)[Open vSwitch Port]: disconnecting for new activation request.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0617] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0619] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0621] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0625] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 kernel: vlan21: left promiscuous mode
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0630] dhcp4 (eth0): activation: beginning transaction (no timeout)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0656] device (eth1): disconnecting for new activation request.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0660] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0665] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0666] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0713] device (eth1): Activation: starting connection 'ci-private-network' (f659475b-7c6f-5319-b371-519bc515c6f0)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0716] device (br-ex)[Open vSwitch Port]: state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0720] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (8546a6a0-ff01-4e77-9a80-0ccb84b09e15)
Oct 08 18:48:59 compute-0 kernel: vlan22: left promiscuous mode
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0755] dhcp4 (eth0): state changed new lease, address=38.102.83.120
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0763] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0803] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0813] policy: auto-activating connection 'vlan20-if' (7ec5abe9-78ff-4cf6-b429-5715d559f836)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0815] policy: auto-activating connection 'vlan21-if' (de041523-e9cd-491b-892a-fc68198a2acd)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0817] policy: auto-activating connection 'vlan22-if' (52d34180-b43f-4c96-a950-d37fc7c59cd0)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0818] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0825] device (lo): Activation: successful, device activated.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0839] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 18:48:59 compute-0 systemd[1]: Started GSSAPI Proxy Daemon.
Oct 08 18:48:59 compute-0 kernel: virtio_net virtio5 eth1: left promiscuous mode
Oct 08 18:48:59 compute-0 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0859] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 systemd[1]: Reached target NFS client services.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0864] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0867] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0869] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0870] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0871] device (eth1)[Open vSwitch Port]: state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct 08 18:48:59 compute-0 systemd[1]: Reached target Preparation for Remote File Systems.
Oct 08 18:48:59 compute-0 systemd[1]: Reached target Remote File Systems.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0902] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (19cf4192-6bcb-444b-ae5b-b65ed7eb80f5)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0906] device (vlan20)[Open vSwitch Port]: state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0915] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (250e055d-e2d6-47d8-a97e-ae3fcd2ad51e)
Oct 08 18:48:59 compute-0 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 08 18:48:59 compute-0 kernel: ovs-system: left promiscuous mode
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0920] device (vlan21)[Open vSwitch Port]: state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0929] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (ae6c2a0f-6e02-482e-bde7-7c80cfee7790)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0933] device (vlan22)[Open vSwitch Port]: state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0941] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (f99e4b34-6f01-46fc-9977-6ae5bcc3a7ce)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0944] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0952] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1035] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1043] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1052] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (7ec5abe9-78ff-4cf6-b429-5715d559f836)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1054] policy: auto-activating connection 'vlan21-if' (de041523-e9cd-491b-892a-fc68198a2acd)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1060] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1073] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1080] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1086] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1089] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1092] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1099] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1103] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1107] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1110] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1116] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1120] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1122] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1125] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1132] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1136] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1139] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1141] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1149] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1162] policy: auto-activating connection 'vlan22-if' (52d34180-b43f-4c96-a950-d37fc7c59cd0)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1165] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1170] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1175] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1182] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1186] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1199] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1207] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1215] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (de041523-e9cd-491b-892a-fc68198a2acd)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1216] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 kernel: ovs-system: entered promiscuous mode
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1228] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1234] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1240] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1246] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 kernel: No such timeout policy "ovs_test_tp"
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1253] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1257] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1262] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (52d34180-b43f-4c96-a950-d37fc7c59cd0)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1262] policy: auto-activating connection 'br-ex-if' (794f8bbe-0f95-43e1-a59b-5efcb30fbf56)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1265] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1272] device (eth0): Activation: successful, device activated.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1279] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1283] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1287] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1293] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1298] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1301] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1311] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1320] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 kernel: vlan20: entered promiscuous mode
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1348] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1351] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1362] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (794f8bbe-0f95-43e1-a59b-5efcb30fbf56)
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1362] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1368] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1376] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1379] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1380] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1385] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1391] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1394] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1396] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1403] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1407] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1411] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 kernel: vlan21: entered promiscuous mode
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1417] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1424] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1431] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1455] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1463] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Oct 08 18:48:59 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1472] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1477] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1483] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1501] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1504] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1509] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1530] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1543] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1554] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1562] device (eth1): Activation: successful, device activated.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1573] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1577] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1583] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1592] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1608] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1645] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1647] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1653] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 08 18:48:59 compute-0 kernel: br-ex: entered promiscuous mode
Oct 08 18:48:59 compute-0 kernel: vlan22: entered promiscuous mode
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1744] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1755] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1769] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1771] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1777] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1844] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1854] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1881] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1882] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1887] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 08 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1894] manager: startup complete
Oct 08 18:48:59 compute-0 systemd[1]: Finished Network Manager Wait Online.
Oct 08 18:48:59 compute-0 systemd[1]: Starting Cloud-init: Network Stage...
Oct 08 18:48:59 compute-0 systemd[1]: Starting Authorization Manager...
Oct 08 18:48:59 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Oct 08 18:48:59 compute-0 polkitd[1188]: Started polkitd version 0.117
Oct 08 18:48:59 compute-0 polkitd[1188]: Loading rules from directory /etc/polkit-1/rules.d
Oct 08 18:48:59 compute-0 polkitd[1188]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 08 18:48:59 compute-0 polkitd[1188]: Finished loading, compiling and executing 3 rules
Oct 08 18:48:59 compute-0 systemd[1]: Started Authorization Manager.
Oct 08 18:48:59 compute-0 polkitd[1188]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Oct 08 18:48:59 compute-0 cloud-init[1253]: Cloud-init v. 24.4-7.el9 running 'init' at Wed, 08 Oct 2025 18:48:59 +0000. Up 10.12 seconds.
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: +++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: +------------+-------+-----------------+---------------+--------+-------------------+
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: |   Device   |   Up  |     Address     |      Mask     | Scope  |     Hw-Address    |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: +------------+-------+-----------------+---------------+--------+-------------------+
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: |   br-ex    |  True | 192.168.122.100 | 255.255.255.0 | global | fa:16:3e:7a:c4:21 |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: |    eth0    |  True |  38.102.83.120  | 255.255.255.0 | global | fa:16:3e:22:ef:71 |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: |    eth1    |  True |        .        |       .       |   .    | fa:16:3e:7a:c4:21 |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: |     lo     |  True |    127.0.0.1    |   255.0.0.0   |  host  |         .         |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: |     lo     |  True |     ::1/128     |       .       |  host  |         .         |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: | ovs-system | False |        .        |       .       |   .    | 96:5c:8d:a9:f1:5f |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: |   vlan20   |  True |   172.17.0.100  | 255.255.255.0 | global | 0a:84:82:fa:ae:3b |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: |   vlan21   |  True |   172.18.0.100  | 255.255.255.0 | global | 16:5f:9c:c1:6f:96 |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: |   vlan22   |  True |   172.19.0.100  | 255.255.255.0 | global | 16:5b:f6:63:52:30 |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: +------------+-------+-----------------+---------------+--------+-------------------+
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: |   3   |    172.17.0.0   |    0.0.0.0    |  255.255.255.0  |   vlan20  |   U   |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: |   4   |    172.18.0.0   |    0.0.0.0    |  255.255.255.0  |   vlan21  |   U   |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: |   5   |    172.19.0.0   |    0.0.0.0    |  255.255.255.0  |   vlan22  |   U   |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: |   6   |  192.168.122.0  |    0.0.0.0    |  255.255.255.0  |   br-ex   |   U   |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: |   2   |  multicast  |    ::   |    eth1   |   U   |
Oct 08 18:48:59 compute-0 cloud-init[1253]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 08 18:48:59 compute-0 systemd[1]: Finished Cloud-init: Network Stage.
Oct 08 18:48:59 compute-0 systemd[1]: Reached target Cloud-config availability.
Oct 08 18:48:59 compute-0 systemd[1]: Reached target Network is Online.
Oct 08 18:49:00 compute-0 systemd[1]: Starting Cloud-init: Config Stage...
Oct 08 18:49:00 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Oct 08 18:49:00 compute-0 systemd[1]: Starting Notify NFS peers of a restart...
Oct 08 18:49:00 compute-0 systemd[1]: Starting System Logging Service...
Oct 08 18:49:00 compute-0 sm-notify[1287]: Version 2.5.4 starting
Oct 08 18:49:00 compute-0 systemd[1]: Starting OpenSSH server daemon...
Oct 08 18:49:00 compute-0 systemd[1]: Starting Permit User Sessions...
Oct 08 18:49:00 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Oct 08 18:49:00 compute-0 systemd[1]: Started Notify NFS peers of a restart.
Oct 08 18:49:00 compute-0 sshd[1289]: Server listening on 0.0.0.0 port 22.
Oct 08 18:49:00 compute-0 sshd[1289]: Server listening on :: port 22.
Oct 08 18:49:00 compute-0 systemd[1]: Started OpenSSH server daemon.
Oct 08 18:49:00 compute-0 systemd[1]: Finished Permit User Sessions.
Oct 08 18:49:00 compute-0 systemd[1]: Started Command Scheduler.
Oct 08 18:49:00 compute-0 systemd[1]: Started Getty on tty1.
Oct 08 18:49:00 compute-0 systemd[1]: Started Serial Getty on ttyS0.
Oct 08 18:49:00 compute-0 crond[1291]: (CRON) STARTUP (1.5.7)
Oct 08 18:49:00 compute-0 crond[1291]: (CRON) INFO (Syslog will be used instead of sendmail.)
Oct 08 18:49:00 compute-0 systemd[1]: Reached target Login Prompts.
Oct 08 18:49:00 compute-0 crond[1291]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 80% if used.)
Oct 08 18:49:00 compute-0 crond[1291]: (CRON) INFO (running with inotify support)
Oct 08 18:49:00 compute-0 rsyslogd[1288]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1288" x-info="https://www.rsyslog.com"] start
Oct 08 18:49:00 compute-0 systemd[1]: Started System Logging Service.
Oct 08 18:49:00 compute-0 systemd[1]: Reached target Multi-User System.
Oct 08 18:49:00 compute-0 systemd[1]: Starting Record Runlevel Change in UTMP...
Oct 08 18:49:00 compute-0 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Oct 08 18:49:00 compute-0 systemd[1]: Finished Record Runlevel Change in UTMP.
Oct 08 18:49:00 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 08 18:49:00 compute-0 cloud-init[1300]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Wed, 08 Oct 2025 18:49:00 +0000. Up 10.94 seconds.
Oct 08 18:49:00 compute-0 systemd[1]: Finished Cloud-init: Config Stage.
Oct 08 18:49:00 compute-0 systemd[1]: Starting Cloud-init: Final Stage...
Oct 08 18:49:00 compute-0 cloud-init[1304]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Wed, 08 Oct 2025 18:49:00 +0000. Up 11.31 seconds.
Oct 08 18:49:00 compute-0 cloud-init[1304]: Cloud-init v. 24.4-7.el9 finished at Wed, 08 Oct 2025 18:49:00 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.36 seconds
Oct 08 18:49:00 compute-0 systemd[1]: Finished Cloud-init: Final Stage.
Oct 08 18:49:00 compute-0 systemd[1]: Reached target Cloud-init target.
Oct 08 18:49:00 compute-0 systemd[1]: Startup finished in 1.658s (kernel) + 3.210s (initrd) + 6.550s (userspace) = 11.419s.
Oct 08 18:49:06 compute-0 irqbalance[840]: Cannot change IRQ 25 affinity: Operation not permitted
Oct 08 18:49:06 compute-0 irqbalance[840]: IRQ 25 affinity is now unmanaged
Oct 08 18:49:06 compute-0 irqbalance[840]: Cannot change IRQ 31 affinity: Operation not permitted
Oct 08 18:49:06 compute-0 irqbalance[840]: IRQ 31 affinity is now unmanaged
Oct 08 18:49:06 compute-0 irqbalance[840]: Cannot change IRQ 28 affinity: Operation not permitted
Oct 08 18:49:06 compute-0 irqbalance[840]: IRQ 28 affinity is now unmanaged
Oct 08 18:49:06 compute-0 irqbalance[840]: Cannot change IRQ 26 affinity: Operation not permitted
Oct 08 18:49:06 compute-0 irqbalance[840]: IRQ 26 affinity is now unmanaged
Oct 08 18:49:06 compute-0 irqbalance[840]: Cannot change IRQ 32 affinity: Operation not permitted
Oct 08 18:49:06 compute-0 irqbalance[840]: IRQ 32 affinity is now unmanaged
Oct 08 18:49:06 compute-0 irqbalance[840]: Cannot change IRQ 30 affinity: Operation not permitted
Oct 08 18:49:06 compute-0 irqbalance[840]: IRQ 30 affinity is now unmanaged
Oct 08 18:49:06 compute-0 irqbalance[840]: Cannot change IRQ 29 affinity: Operation not permitted
Oct 08 18:49:06 compute-0 irqbalance[840]: IRQ 29 affinity is now unmanaged
Oct 08 18:49:06 compute-0 irqbalance[840]: Cannot change IRQ 27 affinity: Operation not permitted
Oct 08 18:49:06 compute-0 irqbalance[840]: IRQ 27 affinity is now unmanaged
Oct 08 18:49:09 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 08 18:49:29 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 08 18:49:40 compute-0 sshd-session[1310]: Accepted publickey for zuul from 192.168.122.30 port 45010 ssh2: ECDSA SHA256:i+73Mx2Y/ukt1b+huf+9w+ftZalnyybbDU6glTR0JfU
Oct 08 18:49:40 compute-0 systemd[1]: Created slice User Slice of UID 1000.
Oct 08 18:49:40 compute-0 systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct 08 18:49:40 compute-0 systemd-logind[844]: New session 1 of user zuul.
Oct 08 18:49:40 compute-0 systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct 08 18:49:40 compute-0 systemd[1]: Starting User Manager for UID 1000...
Oct 08 18:49:40 compute-0 systemd[1314]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 08 18:49:40 compute-0 systemd[1314]: Queued start job for default target Main User Target.
Oct 08 18:49:40 compute-0 systemd[1314]: Created slice User Application Slice.
Oct 08 18:49:40 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 08 18:49:40 compute-0 systemd[1314]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 08 18:49:40 compute-0 systemd[1314]: Started Daily Cleanup of User's Temporary Directories.
Oct 08 18:49:40 compute-0 systemd[1314]: Reached target Paths.
Oct 08 18:49:40 compute-0 systemd[1314]: Reached target Timers.
Oct 08 18:49:40 compute-0 systemd[1314]: Starting D-Bus User Message Bus Socket...
Oct 08 18:49:40 compute-0 systemd[1314]: Starting Create User's Volatile Files and Directories...
Oct 08 18:49:40 compute-0 systemd[1314]: Finished Create User's Volatile Files and Directories.
Oct 08 18:49:40 compute-0 systemd[1314]: Listening on D-Bus User Message Bus Socket.
Oct 08 18:49:40 compute-0 systemd[1314]: Reached target Sockets.
Oct 08 18:49:40 compute-0 systemd[1314]: Reached target Basic System.
Oct 08 18:49:40 compute-0 systemd[1314]: Reached target Main User Target.
Oct 08 18:49:40 compute-0 systemd[1314]: Startup finished in 184ms.
Oct 08 18:49:40 compute-0 systemd[1]: Started User Manager for UID 1000.
Oct 08 18:49:40 compute-0 systemd[1]: Started Session 1 of User zuul.
Oct 08 18:49:40 compute-0 sshd-session[1310]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 08 18:49:40 compute-0 sudo[1356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxweftvoawwmanvagcpwydcqxjmvmtdl ; cat /proc/sys/kernel/random/boot_id'
Oct 08 18:49:40 compute-0 sudo[1356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:49:40 compute-0 sudo[1356]: pam_unix(sudo:session): session closed for user root
Oct 08 18:49:41 compute-0 sudo[1385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfpjoehpkwvnpbiotfssudurjnvgkrmm ; whoami'
Oct 08 18:49:41 compute-0 sudo[1385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:49:41 compute-0 sudo[1385]: pam_unix(sudo:session): session closed for user root
Oct 08 18:49:41 compute-0 sudo[1537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wztnxzeiakuenfhkcjyfgdezqbxflzhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949380.8056605-125-18657848218222/AnsiballZ_file.py'
Oct 08 18:49:41 compute-0 sudo[1537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:49:41 compute-0 python3.9[1539]: ansible-ansible.builtin.file Invoked with path=/var/lib/openstack/reboot_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:49:41 compute-0 sudo[1537]: pam_unix(sudo:session): session closed for user root
Oct 08 18:49:42 compute-0 sshd-session[1329]: Connection closed by 192.168.122.30 port 45010
Oct 08 18:49:42 compute-0 sshd-session[1310]: pam_unix(sshd:session): session closed for user zuul
Oct 08 18:49:42 compute-0 systemd[1]: session-1.scope: Deactivated successfully.
Oct 08 18:49:42 compute-0 systemd-logind[844]: Session 1 logged out. Waiting for processes to exit.
Oct 08 18:49:42 compute-0 systemd-logind[844]: Removed session 1.
Oct 08 18:49:47 compute-0 sshd-session[1564]: Accepted publickey for zuul from 192.168.122.30 port 46310 ssh2: ECDSA SHA256:i+73Mx2Y/ukt1b+huf+9w+ftZalnyybbDU6glTR0JfU
Oct 08 18:49:47 compute-0 systemd-logind[844]: New session 3 of user zuul.
Oct 08 18:49:47 compute-0 systemd[1]: Started Session 3 of User zuul.
Oct 08 18:49:47 compute-0 sshd-session[1564]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 08 18:49:48 compute-0 python3.9[1717]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 08 18:49:49 compute-0 sudo[1871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtcyexokytoauvwfrkjvjrlophkesrqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949388.967516-50-86557769633094/AnsiballZ_file.py'
Oct 08 18:49:49 compute-0 sudo[1871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:49:50 compute-0 python3.9[1873]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:49:50 compute-0 sudo[1871]: pam_unix(sudo:session): session closed for user root
Oct 08 18:49:50 compute-0 sudo[2023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djuzeyhikbvyvsvpxrghdscaixkshdwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949389.833654-50-273261219635061/AnsiballZ_file.py'
Oct 08 18:49:50 compute-0 sudo[2023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:49:50 compute-0 python3.9[2025]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:49:50 compute-0 sudo[2023]: pam_unix(sudo:session): session closed for user root
Oct 08 18:49:51 compute-0 sudo[2175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrpgazkkyjzgdodzsvksycuuhtyohqev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949390.5945773-65-224496674355243/AnsiballZ_stat.py'
Oct 08 18:49:51 compute-0 sudo[2175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:49:51 compute-0 python3.9[2177]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:49:51 compute-0 sudo[2175]: pam_unix(sudo:session): session closed for user root
Oct 08 18:49:52 compute-0 sudo[2298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjfvduuatuafenvtlowheojijqbrirru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949390.5945773-65-224496674355243/AnsiballZ_copy.py'
Oct 08 18:49:52 compute-0 sudo[2298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:49:52 compute-0 python3.9[2300]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949390.5945773-65-224496674355243/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=be98bc5d9b8806bf64c094b8d28e699f7676d168 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:49:52 compute-0 sudo[2298]: pam_unix(sudo:session): session closed for user root
Oct 08 18:49:53 compute-0 sudo[2450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzwrlzfbeaqqfqjxelozxvsjwgfivwdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949392.3440707-65-113709593892587/AnsiballZ_stat.py'
Oct 08 18:49:53 compute-0 sudo[2450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:49:53 compute-0 python3.9[2452]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:49:53 compute-0 sudo[2450]: pam_unix(sudo:session): session closed for user root
Oct 08 18:49:53 compute-0 sudo[2573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzggpmkmvpzkimewhqyhzlorabtfgzra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949392.3440707-65-113709593892587/AnsiballZ_copy.py'
Oct 08 18:49:53 compute-0 sudo[2573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:49:53 compute-0 python3.9[2575]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949392.3440707-65-113709593892587/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=597e518065fbb1bf8e8180d3d00442be9597b2b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:49:54 compute-0 sudo[2573]: pam_unix(sudo:session): session closed for user root
Oct 08 18:49:54 compute-0 sudo[2725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zngeqrosbbxtanhgyrvrbzmfpxogfgcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949393.6744804-65-9693660716769/AnsiballZ_stat.py'
Oct 08 18:49:54 compute-0 sudo[2725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:49:54 compute-0 python3.9[2727]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:49:54 compute-0 sudo[2725]: pam_unix(sudo:session): session closed for user root
Oct 08 18:49:55 compute-0 sudo[2848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfamdfbxsxgnaixoxkbzivsekyhojpfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949393.6744804-65-9693660716769/AnsiballZ_copy.py'
Oct 08 18:49:55 compute-0 sudo[2848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:49:55 compute-0 python3.9[2850]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949393.6744804-65-9693660716769/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=1281659b32c29a1dc7e0f0a384629b6f851ea22a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:49:55 compute-0 sudo[2848]: pam_unix(sudo:session): session closed for user root
Oct 08 18:49:55 compute-0 sudo[3000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrrizkvqvqbpbyqdqjytpgiuwwjkzqgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949395.0521512-109-122970405596331/AnsiballZ_file.py'
Oct 08 18:49:55 compute-0 sudo[3000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:49:56 compute-0 python3.9[3002]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:49:56 compute-0 sudo[3000]: pam_unix(sudo:session): session closed for user root
Oct 08 18:49:56 compute-0 sudo[3152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrcechmtmypukpunacpixvfputckmozl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949395.7800825-109-232606632486240/AnsiballZ_file.py'
Oct 08 18:49:56 compute-0 sudo[3152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:49:56 compute-0 python3.9[3154]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:49:56 compute-0 sudo[3152]: pam_unix(sudo:session): session closed for user root
Oct 08 18:49:57 compute-0 sudo[3304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rirqogoisxdlchiogzfpmmzrkmawrynv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949396.6421862-124-137391660181529/AnsiballZ_stat.py'
Oct 08 18:49:57 compute-0 sudo[3304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:49:57 compute-0 python3.9[3306]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:49:57 compute-0 sudo[3304]: pam_unix(sudo:session): session closed for user root
Oct 08 18:49:58 compute-0 sudo[3427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhplwvnpbasqspmbyuoqfahmonrsdiyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949396.6421862-124-137391660181529/AnsiballZ_copy.py'
Oct 08 18:49:58 compute-0 sudo[3427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:49:58 compute-0 python3.9[3429]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949396.6421862-124-137391660181529/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=7aeb83cb2a0f60617f03db5305a25fcf55b55f32 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:49:58 compute-0 sudo[3427]: pam_unix(sudo:session): session closed for user root
Oct 08 18:49:58 compute-0 sudo[3579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmfrfgiqglmeqlicfsbnoheaglobtssb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949398.0560448-124-156558864807952/AnsiballZ_stat.py'
Oct 08 18:49:58 compute-0 sudo[3579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:49:59 compute-0 python3.9[3581]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:49:59 compute-0 sudo[3579]: pam_unix(sudo:session): session closed for user root
Oct 08 18:49:59 compute-0 sudo[3702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouoziyxhnlmyajbfrngffwiyljjciwjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949398.0560448-124-156558864807952/AnsiballZ_copy.py'
Oct 08 18:49:59 compute-0 sudo[3702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:49:59 compute-0 python3.9[3704]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949398.0560448-124-156558864807952/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=c26f6e5193be2786a8b8bb5189b3cb6e2b9477fa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:49:59 compute-0 sudo[3702]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:00 compute-0 systemd[1]: Starting system activity accounting tool...
Oct 08 18:50:00 compute-0 systemd[1]: sysstat-collect.service: Deactivated successfully.
Oct 08 18:50:00 compute-0 systemd[1]: Finished system activity accounting tool.
Oct 08 18:50:00 compute-0 sudo[3856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sigldbabjxujfyfshvhgaftdraarumql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949399.3501544-124-146296131647806/AnsiballZ_stat.py'
Oct 08 18:50:00 compute-0 sudo[3856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:00 compute-0 python3.9[3858]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:00 compute-0 sudo[3856]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:00 compute-0 sudo[3979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guevjyptlphspxkanqleouixisonmmrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949399.3501544-124-146296131647806/AnsiballZ_copy.py'
Oct 08 18:50:00 compute-0 sudo[3979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:00 compute-0 python3.9[3981]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949399.3501544-124-146296131647806/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=bfbcf20cb82d842ee9779ffe8c4e17c06873b8ee backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:00 compute-0 sudo[3979]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:01 compute-0 sudo[4131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvmptcobqrrhblzuojythzbjpysqbonr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949400.7180948-168-20286777769662/AnsiballZ_file.py'
Oct 08 18:50:01 compute-0 sudo[4131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:01 compute-0 python3.9[4133]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:50:01 compute-0 sudo[4131]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:02 compute-0 sudo[4283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmzfspkenwyoywfybjmmwyhzsongtlnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949401.4080162-168-208237420992275/AnsiballZ_file.py'
Oct 08 18:50:02 compute-0 sudo[4283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:02 compute-0 python3.9[4285]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:50:02 compute-0 sudo[4283]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:03 compute-0 sudo[4435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdfybjvkokefjsrsxqbeqssguzvkcvhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949402.181615-183-38451611957925/AnsiballZ_stat.py'
Oct 08 18:50:03 compute-0 sudo[4435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:03 compute-0 python3.9[4437]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:03 compute-0 sudo[4435]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:03 compute-0 sudo[4558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlqoevwbpmrgvildvrnhremiiqcxxerg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949402.181615-183-38451611957925/AnsiballZ_copy.py'
Oct 08 18:50:03 compute-0 sudo[4558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:03 compute-0 python3.9[4560]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949402.181615-183-38451611957925/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=02896da65fce18a168e7910f5af7526a79efb8fe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:03 compute-0 sudo[4558]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:04 compute-0 sudo[4710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rovaslhttnovdmfgjhqwudgwprehcpre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949403.507737-183-53996794758040/AnsiballZ_stat.py'
Oct 08 18:50:04 compute-0 sudo[4710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:04 compute-0 python3.9[4712]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:04 compute-0 sudo[4710]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:04 compute-0 sudo[4833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcthemrazbbzdfwtxtjdrmbkemwwbeij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949403.507737-183-53996794758040/AnsiballZ_copy.py'
Oct 08 18:50:04 compute-0 sudo[4833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:05 compute-0 python3.9[4835]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949403.507737-183-53996794758040/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=dbbb1b6fdf947d6757e9a97b4db0f0fed02fc7bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:05 compute-0 sudo[4833]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:05 compute-0 sudo[4985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqwjytwbrnshsgpnzgwjoovqpgyendxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949404.8134964-183-94886104920985/AnsiballZ_stat.py'
Oct 08 18:50:05 compute-0 sudo[4985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:05 compute-0 python3.9[4987]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:05 compute-0 sudo[4985]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:06 compute-0 sudo[5108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twgokbizrniloophonrsczkadzqwarcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949404.8134964-183-94886104920985/AnsiballZ_copy.py'
Oct 08 18:50:06 compute-0 sudo[5108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:06 compute-0 python3.9[5110]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949404.8134964-183-94886104920985/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=0a8ddb07cb526effa9222003433072d0e2469723 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:06 compute-0 sudo[5108]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:07 compute-0 sudo[5260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkfzbrsjcievwqfmynueppdimlrjtcum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949406.3089054-227-93212311330620/AnsiballZ_file.py'
Oct 08 18:50:07 compute-0 sudo[5260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:07 compute-0 python3.9[5262]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:50:07 compute-0 sudo[5260]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:07 compute-0 sudo[5412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgttxdilplmkgwisvltrlrasxpkpfaym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949407.0584092-227-26132725835366/AnsiballZ_file.py'
Oct 08 18:50:07 compute-0 sudo[5412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:08 compute-0 python3.9[5414]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:50:08 compute-0 sudo[5412]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:08 compute-0 sudo[5564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytveyaakxdgewylxxxlkpeqqfivjrhai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949407.829225-242-126511778725729/AnsiballZ_stat.py'
Oct 08 18:50:08 compute-0 sudo[5564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:08 compute-0 python3.9[5566]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:08 compute-0 sudo[5564]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:09 compute-0 sudo[5687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewsyhjebkarcflhjcmvtvsnihaepmwvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949407.829225-242-126511778725729/AnsiballZ_copy.py'
Oct 08 18:50:09 compute-0 sudo[5687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:09 compute-0 python3.9[5689]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949407.829225-242-126511778725729/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=e59bd502e03ac84f245d8f11272237c8e4c2985c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:09 compute-0 sudo[5687]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:09 compute-0 sudo[5839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdxnuhtchemuojxpmmccxsarizggtfjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949409.1662827-242-75650193509197/AnsiballZ_stat.py'
Oct 08 18:50:09 compute-0 sudo[5839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:10 compute-0 python3.9[5841]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:10 compute-0 sudo[5839]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:10 compute-0 sudo[5962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmupvfdvwbjbcbfurnzsielebolfvrjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949409.1662827-242-75650193509197/AnsiballZ_copy.py'
Oct 08 18:50:10 compute-0 sudo[5962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:10 compute-0 python3.9[5964]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949409.1662827-242-75650193509197/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=dbbb1b6fdf947d6757e9a97b4db0f0fed02fc7bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:10 compute-0 sudo[5962]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:11 compute-0 sudo[6114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqqmrewvuhnmjfsbyhygjskufipvbvlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949410.5229716-242-213434339985290/AnsiballZ_stat.py'
Oct 08 18:50:11 compute-0 sudo[6114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:11 compute-0 python3.9[6116]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:11 compute-0 sudo[6114]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:11 compute-0 sudo[6237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqbmsutpolcgftrstlyanvtggvdfrkzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949410.5229716-242-213434339985290/AnsiballZ_copy.py'
Oct 08 18:50:11 compute-0 sudo[6237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:12 compute-0 python3.9[6239]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949410.5229716-242-213434339985290/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=bdb025fff65dd78b3338c3645348e6625a6566d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:12 compute-0 sudo[6237]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:13 compute-0 sudo[6389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqjfcfzitlqqcxmkfswitpeagvjvxwla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949412.291562-302-83557335664452/AnsiballZ_file.py'
Oct 08 18:50:13 compute-0 sudo[6389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:13 compute-0 python3.9[6391]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:50:13 compute-0 sudo[6389]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:13 compute-0 sudo[6541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzehcxheyhgmcqowyqszhqlpkewvytze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949412.9994059-310-123677732071107/AnsiballZ_stat.py'
Oct 08 18:50:13 compute-0 sudo[6541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:14 compute-0 python3.9[6543]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:14 compute-0 sudo[6541]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:14 compute-0 sudo[6664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wabarlszzqgggfterqhpafomggynwczt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949412.9994059-310-123677732071107/AnsiballZ_copy.py'
Oct 08 18:50:14 compute-0 sudo[6664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:14 compute-0 python3.9[6666]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949412.9994059-310-123677732071107/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fff1c6b7e11d6b8bf60629262eb6aa0aa1c835 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:14 compute-0 sudo[6664]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:15 compute-0 sudo[6816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfznflgmrdmnrrqcphzktveqklxnizlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949414.4679043-326-220414865799599/AnsiballZ_file.py'
Oct 08 18:50:15 compute-0 sudo[6816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:15 compute-0 python3.9[6818]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:50:15 compute-0 sudo[6816]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:15 compute-0 sudo[6968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzhxczsvsnlzzqbbmfpqhmbpibitsszs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949415.107704-334-53778534495422/AnsiballZ_stat.py'
Oct 08 18:50:15 compute-0 sudo[6968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:16 compute-0 python3.9[6970]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:16 compute-0 sudo[6968]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:16 compute-0 sudo[7091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmfsxwnukyvqjphcawgklwmcksczpcaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949415.107704-334-53778534495422/AnsiballZ_copy.py'
Oct 08 18:50:16 compute-0 sudo[7091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:16 compute-0 python3.9[7093]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949415.107704-334-53778534495422/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fff1c6b7e11d6b8bf60629262eb6aa0aa1c835 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:16 compute-0 sudo[7091]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:17 compute-0 sudo[7243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvcgjdhrpjhbouaogvrqdradslcyhclu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949416.394674-350-98010957052043/AnsiballZ_file.py'
Oct 08 18:50:17 compute-0 sudo[7243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:17 compute-0 python3.9[7245]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:50:17 compute-0 sudo[7243]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:17 compute-0 sudo[7395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnnfiecbwvihuixnposmgfoojzjmmekc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949417.1053324-358-47357061822012/AnsiballZ_stat.py'
Oct 08 18:50:17 compute-0 sudo[7395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:18 compute-0 python3.9[7397]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:18 compute-0 sudo[7395]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:18 compute-0 sudo[7518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcmdthtmutmvgncqsdykfgqiywaqanqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949417.1053324-358-47357061822012/AnsiballZ_copy.py'
Oct 08 18:50:18 compute-0 sudo[7518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:18 compute-0 python3.9[7520]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949417.1053324-358-47357061822012/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fff1c6b7e11d6b8bf60629262eb6aa0aa1c835 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:18 compute-0 sudo[7518]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:19 compute-0 sudo[7670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zysalxfzhayuxoiqclqwkhvqqplelfce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949418.5166404-374-196668691482334/AnsiballZ_file.py'
Oct 08 18:50:19 compute-0 sudo[7670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:19 compute-0 python3.9[7672]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:50:19 compute-0 sudo[7670]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:20 compute-0 sudo[7822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbzypirdgzcygensbjeixudaxnsjhmou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949419.272474-382-10723025164268/AnsiballZ_stat.py'
Oct 08 18:50:20 compute-0 sudo[7822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:20 compute-0 python3.9[7824]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:20 compute-0 sudo[7822]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:20 compute-0 sudo[7945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjldbsusrzbeqkrlphwxewpfglwabkpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949419.272474-382-10723025164268/AnsiballZ_copy.py'
Oct 08 18:50:20 compute-0 sudo[7945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:20 compute-0 python3.9[7947]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949419.272474-382-10723025164268/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fff1c6b7e11d6b8bf60629262eb6aa0aa1c835 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:20 compute-0 sudo[7945]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:21 compute-0 sudo[8097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rknvxgydngregugdnroiotfqxmfzklyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949420.7091587-398-252228535550368/AnsiballZ_file.py'
Oct 08 18:50:21 compute-0 sudo[8097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:21 compute-0 python3.9[8099]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:50:21 compute-0 sudo[8097]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:22 compute-0 sudo[8249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdmfazscotsfdvsjkcknymhqxrycugbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949421.4911299-406-205720063418029/AnsiballZ_stat.py'
Oct 08 18:50:22 compute-0 sudo[8249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:22 compute-0 python3.9[8251]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:22 compute-0 sudo[8249]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:22 compute-0 sudo[8372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubjobpsrrwmkbdesqvwugsqfodangqzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949421.4911299-406-205720063418029/AnsiballZ_copy.py'
Oct 08 18:50:22 compute-0 sudo[8372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:23 compute-0 python3.9[8374]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949421.4911299-406-205720063418029/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fff1c6b7e11d6b8bf60629262eb6aa0aa1c835 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:23 compute-0 sudo[8372]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:23 compute-0 sudo[8524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddcfeymbpnpppmchmuexipeiqmhtpqim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949422.9085033-422-119366901928405/AnsiballZ_file.py'
Oct 08 18:50:23 compute-0 sudo[8524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:23 compute-0 python3.9[8526]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:50:23 compute-0 sudo[8524]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:24 compute-0 sudo[8676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bltfcqnzxdkydjflrhzqhinujiqzuzcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949423.6835752-430-106338857731098/AnsiballZ_stat.py'
Oct 08 18:50:24 compute-0 sudo[8676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:24 compute-0 python3.9[8678]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:24 compute-0 sudo[8676]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:25 compute-0 sudo[8799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yofbquyhyqdiacrlkrgtexunjukeeckv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949423.6835752-430-106338857731098/AnsiballZ_copy.py'
Oct 08 18:50:25 compute-0 sudo[8799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:25 compute-0 python3.9[8801]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949423.6835752-430-106338857731098/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fff1c6b7e11d6b8bf60629262eb6aa0aa1c835 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:25 compute-0 sudo[8799]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:25 compute-0 sudo[8951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hazeryljocpnmrfuinflspwvvecbrtza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949425.1990786-446-149542652766675/AnsiballZ_file.py'
Oct 08 18:50:25 compute-0 sudo[8951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:26 compute-0 python3.9[8953]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:50:26 compute-0 sudo[8951]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:26 compute-0 sudo[9103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpajdftsixxsisxogiythrzahdphmfxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949425.9541163-454-52092794050268/AnsiballZ_stat.py'
Oct 08 18:50:26 compute-0 sudo[9103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:26 compute-0 python3.9[9105]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:27 compute-0 sudo[9103]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:27 compute-0 sudo[9226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eagqtfqzrzjoswrsuipdiiokeftfxcch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949425.9541163-454-52092794050268/AnsiballZ_copy.py'
Oct 08 18:50:27 compute-0 sudo[9226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:27 compute-0 python3.9[9228]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949425.9541163-454-52092794050268/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fff1c6b7e11d6b8bf60629262eb6aa0aa1c835 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:27 compute-0 sudo[9226]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:28 compute-0 sshd-session[1567]: Connection closed by 192.168.122.30 port 46310
Oct 08 18:50:28 compute-0 sshd-session[1564]: pam_unix(sshd:session): session closed for user zuul
Oct 08 18:50:28 compute-0 systemd[1]: session-3.scope: Deactivated successfully.
Oct 08 18:50:28 compute-0 systemd[1]: session-3.scope: Consumed 32.973s CPU time.
Oct 08 18:50:28 compute-0 systemd-logind[844]: Session 3 logged out. Waiting for processes to exit.
Oct 08 18:50:28 compute-0 systemd-logind[844]: Removed session 3.
Oct 08 18:50:33 compute-0 sshd-session[9253]: Accepted publickey for zuul from 192.168.122.30 port 34034 ssh2: ECDSA SHA256:i+73Mx2Y/ukt1b+huf+9w+ftZalnyybbDU6glTR0JfU
Oct 08 18:50:33 compute-0 systemd-logind[844]: New session 4 of user zuul.
Oct 08 18:50:33 compute-0 systemd[1]: Started Session 4 of User zuul.
Oct 08 18:50:33 compute-0 sshd-session[9253]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 08 18:50:35 compute-0 python3.9[9406]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 08 18:50:35 compute-0 sudo[9560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbryxgpyclxliowoabkhrbdoxyllpeaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949435.0048182-34-264019206994476/AnsiballZ_file.py'
Oct 08 18:50:35 compute-0 sudo[9560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:36 compute-0 python3.9[9562]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:50:36 compute-0 sudo[9560]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:36 compute-0 sudo[9712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvtyrbwldtgvezodnzrbekjqktxxelgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949435.8342502-34-249696199663948/AnsiballZ_file.py'
Oct 08 18:50:36 compute-0 sudo[9712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:36 compute-0 python3.9[9714]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:50:36 compute-0 sudo[9712]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:37 compute-0 python3.9[9864]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 08 18:50:38 compute-0 sudo[10014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxcouhxnobsvnxmtkniqnzbkasdfmbhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949437.344911-57-120370946683142/AnsiballZ_seboolean.py'
Oct 08 18:50:38 compute-0 sudo[10014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:38 compute-0 python3.9[10016]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 08 18:50:42 compute-0 sudo[10014]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:42 compute-0 sudo[10170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmfkjlluuvkhwozywbszalhwuveejeta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949441.7753584-67-40355878834869/AnsiballZ_setup.py'
Oct 08 18:50:42 compute-0 dbus-broker-launch[836]: avc:  op=load_policy lsm=selinux seqno=2 res=1
Oct 08 18:50:42 compute-0 sudo[10170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:42 compute-0 python3.9[10172]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 08 18:50:43 compute-0 sudo[10170]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:43 compute-0 sudo[10254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcqbylhsufpcvxfhjenyrtrbihcapfpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949441.7753584-67-40355878834869/AnsiballZ_dnf.py'
Oct 08 18:50:43 compute-0 sudo[10254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:43 compute-0 python3.9[10256]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 08 18:50:46 compute-0 sudo[10254]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:47 compute-0 sudo[10407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyqiwhrkimgkubrxinjgtcwdfvcfrxgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949446.2516468-79-20615080136116/AnsiballZ_systemd.py'
Oct 08 18:50:47 compute-0 sudo[10407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:47 compute-0 python3.9[10409]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 08 18:50:47 compute-0 sudo[10407]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:48 compute-0 sudo[10562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmqbejblvxvyxvpqwxqveggnsfeokgoo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759949447.5782015-87-273375253828526/AnsiballZ_edpm_nftables_snippet.py'
Oct 08 18:50:48 compute-0 sudo[10562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:48 compute-0 python3[10564]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Oct 08 18:50:48 compute-0 sudo[10562]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:49 compute-0 sudo[10714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddpkhydqpjootitddjjqsbasknumlmll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949448.6050694-96-113703379519146/AnsiballZ_file.py'
Oct 08 18:50:49 compute-0 sudo[10714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:49 compute-0 python3.9[10716]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:49 compute-0 sudo[10714]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:50 compute-0 sudo[10866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lubceakxyqpkurmimitujhcbwtzcnsrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949449.258614-104-249701935489664/AnsiballZ_stat.py'
Oct 08 18:50:50 compute-0 sudo[10866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:50 compute-0 python3.9[10868]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:50 compute-0 sudo[10866]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:50 compute-0 sudo[10944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzfprymfqebsjuutvhyrkqwsiohjsfoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949449.258614-104-249701935489664/AnsiballZ_file.py'
Oct 08 18:50:50 compute-0 sudo[10944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:50 compute-0 python3.9[10946]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:50 compute-0 sudo[10944]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:51 compute-0 sudo[11096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bukozdclkeeipqmnpypmyprtkttyhmbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949450.612478-116-104839182100897/AnsiballZ_stat.py'
Oct 08 18:50:51 compute-0 sudo[11096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:51 compute-0 python3.9[11098]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:51 compute-0 sudo[11096]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:51 compute-0 sudo[11174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjegaiffidaagmahbqepjizrypdbpdqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949450.612478-116-104839182100897/AnsiballZ_file.py'
Oct 08 18:50:51 compute-0 sudo[11174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:52 compute-0 python3.9[11176]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.3chpsua9 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:52 compute-0 sudo[11174]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:52 compute-0 sudo[11326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjopmmvviueicvbwoqgktxyefoydxkje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949451.7296188-128-110785741482410/AnsiballZ_stat.py'
Oct 08 18:50:52 compute-0 sudo[11326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:52 compute-0 python3.9[11328]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:52 compute-0 sudo[11326]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:52 compute-0 sudo[11404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyorpxqnlwbytbdvjgouvbtjckvjbhah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949451.7296188-128-110785741482410/AnsiballZ_file.py'
Oct 08 18:50:52 compute-0 sudo[11404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:53 compute-0 python3.9[11406]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:53 compute-0 sudo[11404]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:53 compute-0 sudo[11556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqkrihylebwtwmflaiueiibwigvnzuvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949452.921123-141-272269876664459/AnsiballZ_command.py'
Oct 08 18:50:53 compute-0 sudo[11556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:54 compute-0 python3.9[11558]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:50:54 compute-0 sudo[11556]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:54 compute-0 sudo[11709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oteellihvqfqpuoaxpgwggtclxgkxuis ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759949453.7633247-149-134400314792539/AnsiballZ_edpm_nftables_from_files.py'
Oct 08 18:50:54 compute-0 sudo[11709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:54 compute-0 python3[11711]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 08 18:50:54 compute-0 sudo[11709]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:55 compute-0 sudo[11861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enpvnjxzzqzkuiznernphsntwjkctbjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949454.6626768-157-214285420327743/AnsiballZ_stat.py'
Oct 08 18:50:55 compute-0 sudo[11861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:55 compute-0 python3.9[11863]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:55 compute-0 sudo[11861]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:56 compute-0 sudo[11986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viydrtqjhzmlptkiwoeibcfafvufosqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949454.6626768-157-214285420327743/AnsiballZ_copy.py'
Oct 08 18:50:56 compute-0 sudo[11986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:56 compute-0 python3.9[11988]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949454.6626768-157-214285420327743/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:56 compute-0 sudo[11986]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:57 compute-0 sudo[12138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbmoaqxpermhebuisaqznfreygczpgmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949456.2449355-172-183391960342928/AnsiballZ_stat.py'
Oct 08 18:50:57 compute-0 sudo[12138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:57 compute-0 python3.9[12140]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:57 compute-0 sudo[12138]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:57 compute-0 sudo[12263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifjldupllbhdbcvoxxrvxhiauloajbqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949456.2449355-172-183391960342928/AnsiballZ_copy.py'
Oct 08 18:50:57 compute-0 sudo[12263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:57 compute-0 python3.9[12265]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949456.2449355-172-183391960342928/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:57 compute-0 sudo[12263]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:58 compute-0 sudo[12415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbtmbzmbequywvdhtmcxqtbxjkhmbjaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949457.5821939-187-181303124878137/AnsiballZ_stat.py'
Oct 08 18:50:58 compute-0 sudo[12415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:58 compute-0 python3.9[12417]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:58 compute-0 sudo[12415]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:59 compute-0 sudo[12540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgxrguzfwwvajuyhaqyqkqjdddchxxhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949457.5821939-187-181303124878137/AnsiballZ_copy.py'
Oct 08 18:50:59 compute-0 sudo[12540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:59 compute-0 python3.9[12542]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949457.5821939-187-181303124878137/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:50:59 compute-0 sudo[12540]: pam_unix(sudo:session): session closed for user root
Oct 08 18:50:59 compute-0 sudo[12692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqvkchnenorgjvokzegdldaiqokzeohd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949458.9373817-202-41985063136530/AnsiballZ_stat.py'
Oct 08 18:50:59 compute-0 sudo[12692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:50:59 compute-0 python3.9[12694]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:50:59 compute-0 sudo[12692]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:00 compute-0 sudo[12817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kewatsboqputxangxpvqqpvtctmrfebq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949458.9373817-202-41985063136530/AnsiballZ_copy.py'
Oct 08 18:51:00 compute-0 sudo[12817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:00 compute-0 python3.9[12819]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949458.9373817-202-41985063136530/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:51:00 compute-0 sudo[12817]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:01 compute-0 sudo[12969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttaubsgvtejwzvcmpojhqfhxnmntqpzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949460.2715518-217-258282885353684/AnsiballZ_stat.py'
Oct 08 18:51:01 compute-0 sudo[12969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:01 compute-0 python3.9[12971]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:51:01 compute-0 sudo[12969]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:01 compute-0 sudo[13094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njjabgubwpseiudyjtdcnwtxgxjsvrgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949460.2715518-217-258282885353684/AnsiballZ_copy.py'
Oct 08 18:51:01 compute-0 sudo[13094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:01 compute-0 python3.9[13096]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949460.2715518-217-258282885353684/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:51:01 compute-0 sudo[13094]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:02 compute-0 sudo[13246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjrejxavioqiqavokdakrmsvecopfwdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949461.691022-232-194004078338800/AnsiballZ_file.py'
Oct 08 18:51:02 compute-0 sudo[13246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:02 compute-0 python3.9[13248]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:51:02 compute-0 sudo[13246]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:03 compute-0 sudo[13398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnfupzpotdfvblxicuythmxycuzyrtmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949462.3151777-240-145084452676289/AnsiballZ_command.py'
Oct 08 18:51:03 compute-0 sudo[13398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:03 compute-0 python3.9[13400]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:51:03 compute-0 sudo[13398]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:04 compute-0 sudo[13553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhoclcljxpfdoscweldvlyohrqlxoflz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949463.112179-248-68795392172654/AnsiballZ_blockinfile.py'
Oct 08 18:51:04 compute-0 sudo[13553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:04 compute-0 python3.9[13555]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:51:04 compute-0 sudo[13553]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:04 compute-0 sudo[13705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxvoeexdyqgqmybvabxhaommdndwjpvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949464.133578-257-160355922597703/AnsiballZ_command.py'
Oct 08 18:51:04 compute-0 sudo[13705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:05 compute-0 python3.9[13707]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:51:05 compute-0 sudo[13705]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:05 compute-0 sudo[13858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptwytlrnzvrantzjzqisiiucrjnvwqkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949464.8837636-265-206926835567234/AnsiballZ_stat.py'
Oct 08 18:51:05 compute-0 sudo[13858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:05 compute-0 python3.9[13860]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:51:05 compute-0 sudo[13858]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:06 compute-0 sudo[14012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebdiahvmdnweweazxqquyczhxpbhihfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949465.6737638-273-157417143406209/AnsiballZ_command.py'
Oct 08 18:51:06 compute-0 sudo[14012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:06 compute-0 python3.9[14014]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:51:06 compute-0 sudo[14012]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:07 compute-0 sudo[14167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiidurvkizfauorxsolwrmoixlpbesiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949466.4484897-281-225242969668761/AnsiballZ_file.py'
Oct 08 18:51:07 compute-0 sudo[14167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:07 compute-0 python3.9[14169]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:51:07 compute-0 sudo[14167]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:08 compute-0 python3.9[14319]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 08 18:51:09 compute-0 sudo[14470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljwhqltwoxjznwhenybgsqqohdsigjrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949468.717806-321-262728863731898/AnsiballZ_command.py'
Oct 08 18:51:09 compute-0 sudo[14470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:09 compute-0 python3.9[14472]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:d8:76:c8:90" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:51:09 compute-0 ovs-vsctl[14473]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:d8:76:c8:90 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Oct 08 18:51:09 compute-0 sudo[14470]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:10 compute-0 sudo[14623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwvlkkoqaxmkjdrcpkeealgrqlvicncs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949469.44788-330-109681983431720/AnsiballZ_command.py'
Oct 08 18:51:10 compute-0 sudo[14623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:10 compute-0 python3.9[14625]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:51:10 compute-0 sudo[14623]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:11 compute-0 sudo[14778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsyfhhczrrdbrypgqgvggsnfhjbldyqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949470.330557-338-146685574217337/AnsiballZ_command.py'
Oct 08 18:51:11 compute-0 sudo[14778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:11 compute-0 python3.9[14780]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:51:11 compute-0 ovs-vsctl[14781]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Oct 08 18:51:11 compute-0 sudo[14778]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:12 compute-0 python3.9[14931]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:51:12 compute-0 sudo[15083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-benguehhqqtckxspxjyuiohvtqttsdhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949471.8562953-355-49701318375415/AnsiballZ_file.py'
Oct 08 18:51:12 compute-0 sudo[15083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:12 compute-0 python3.9[15085]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:51:12 compute-0 sudo[15083]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:13 compute-0 sudo[15235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqyzsrmeyvvfkecatlwwfzvsfmmkfkua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949472.703947-363-209018489251255/AnsiballZ_stat.py'
Oct 08 18:51:13 compute-0 sudo[15235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:13 compute-0 python3.9[15237]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:51:13 compute-0 sudo[15235]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:14 compute-0 sudo[15313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubhkvblfhpkicojkixulnlusychedulo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949472.703947-363-209018489251255/AnsiballZ_file.py'
Oct 08 18:51:14 compute-0 sudo[15313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:14 compute-0 python3.9[15315]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:51:14 compute-0 sudo[15313]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:14 compute-0 sudo[15465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdqffjarlbeacstgqxoxlkjprqegdhcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949473.9006665-363-148878944874037/AnsiballZ_stat.py'
Oct 08 18:51:14 compute-0 sudo[15465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:14 compute-0 python3.9[15467]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:51:14 compute-0 sudo[15465]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:15 compute-0 sudo[15543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfpqeruhkzqmfjydkjclfthvxxgmlbet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949473.9006665-363-148878944874037/AnsiballZ_file.py'
Oct 08 18:51:15 compute-0 sudo[15543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:15 compute-0 python3.9[15545]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:51:15 compute-0 sudo[15543]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:15 compute-0 sudo[15695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhmfcibvxbinjkiaiiuhydbqsoteeylc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949475.0613928-386-52449463560079/AnsiballZ_file.py'
Oct 08 18:51:15 compute-0 sudo[15695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:16 compute-0 python3.9[15697]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:51:16 compute-0 sudo[15695]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:16 compute-0 sudo[15847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjqwejlnyuxesjbmbjnsbnwxhsxykjep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949475.7911072-394-280389150988250/AnsiballZ_stat.py'
Oct 08 18:51:16 compute-0 sudo[15847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:16 compute-0 python3.9[15849]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:51:16 compute-0 sudo[15847]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:17 compute-0 sudo[15925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkxekuelierburpalkjvbjqzflpnsdgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949475.7911072-394-280389150988250/AnsiballZ_file.py'
Oct 08 18:51:17 compute-0 sudo[15925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:17 compute-0 python3.9[15927]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:51:17 compute-0 sudo[15925]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:17 compute-0 sudo[16077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-celmoslpoismllfwosmdpsbmgzfkkbvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949477.0813558-406-86023339226744/AnsiballZ_stat.py'
Oct 08 18:51:17 compute-0 sudo[16077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:18 compute-0 python3.9[16079]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:51:18 compute-0 sudo[16077]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:18 compute-0 sudo[16155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llplhizuyxeufhkfbtyjcwgszbejfbna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949477.0813558-406-86023339226744/AnsiballZ_file.py'
Oct 08 18:51:18 compute-0 sudo[16155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:18 compute-0 python3.9[16157]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:51:18 compute-0 sudo[16155]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:18 compute-0 chronyd[850]: Selected source 162.159.200.1 (pool.ntp.org)
Oct 08 18:51:19 compute-0 sudo[16307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzjgdxtydgwzdemcwbzyystnzsolvmgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949478.4442685-418-135260137520486/AnsiballZ_systemd.py'
Oct 08 18:51:19 compute-0 sudo[16307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:19 compute-0 python3.9[16309]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:51:19 compute-0 systemd[1]: Reloading.
Oct 08 18:51:19 compute-0 systemd-rc-local-generator[16336]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:51:19 compute-0 systemd-sysv-generator[16339]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:51:19 compute-0 sudo[16307]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:20 compute-0 sudo[16495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzxqbssbeuppixgqshnchlikkkiwxihi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949479.687853-426-147822407868964/AnsiballZ_stat.py'
Oct 08 18:51:20 compute-0 sudo[16495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:20 compute-0 python3.9[16497]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:51:20 compute-0 sudo[16495]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:20 compute-0 sudo[16573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cokuplovnvmjpcvmeaaqxxssfthvwbzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949479.687853-426-147822407868964/AnsiballZ_file.py'
Oct 08 18:51:20 compute-0 sudo[16573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:21 compute-0 python3.9[16575]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:51:21 compute-0 sudo[16573]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:21 compute-0 sudo[16725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suczwbtwqquguubbkkipxuvyyyabagbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949480.9547086-438-114965001485523/AnsiballZ_stat.py'
Oct 08 18:51:21 compute-0 sudo[16725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:21 compute-0 python3.9[16727]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:51:21 compute-0 sudo[16725]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:22 compute-0 sudo[16803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqfaggurkznjrhttigchgxbtvhwmzxrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949480.9547086-438-114965001485523/AnsiballZ_file.py'
Oct 08 18:51:22 compute-0 sudo[16803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:22 compute-0 python3.9[16805]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:51:22 compute-0 sudo[16803]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:22 compute-0 sudo[16955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcsxkhepoqhshjzhpwoieouddemehfyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949482.3414345-450-11197046797099/AnsiballZ_systemd.py'
Oct 08 18:51:22 compute-0 sudo[16955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:23 compute-0 python3.9[16957]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:51:23 compute-0 systemd[1]: Reloading.
Oct 08 18:51:23 compute-0 systemd-rc-local-generator[16976]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:51:23 compute-0 systemd-sysv-generator[16983]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:51:23 compute-0 systemd[1]: Starting Create netns directory...
Oct 08 18:51:23 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 08 18:51:23 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 08 18:51:23 compute-0 systemd[1]: Finished Create netns directory.
Oct 08 18:51:23 compute-0 sudo[16955]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:24 compute-0 sudo[17148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huufgaxeqxwbxglinlgqignqkwyvdouo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949483.7328854-460-182811271165729/AnsiballZ_file.py'
Oct 08 18:51:24 compute-0 sudo[17148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:24 compute-0 python3.9[17150]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:51:24 compute-0 sudo[17148]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:24 compute-0 sudo[17300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipwbqmabefeqlwnbunvpkreagampwhwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949484.4718456-468-184137831269943/AnsiballZ_stat.py'
Oct 08 18:51:24 compute-0 sudo[17300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:25 compute-0 python3.9[17302]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:51:25 compute-0 sudo[17300]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:25 compute-0 sudo[17423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwxkmxorkzghycolnfkdlgqdzzqoylzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949484.4718456-468-184137831269943/AnsiballZ_copy.py'
Oct 08 18:51:25 compute-0 sudo[17423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:25 compute-0 python3.9[17425]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949484.4718456-468-184137831269943/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:51:25 compute-0 sudo[17423]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:26 compute-0 sudo[17575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbrhsxipdsyjrjbhsrmaswuntkwoirru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949486.0181575-485-234957481411888/AnsiballZ_file.py'
Oct 08 18:51:26 compute-0 sudo[17575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:26 compute-0 python3.9[17577]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:51:26 compute-0 sudo[17575]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:27 compute-0 sudo[17727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daeulimuauchfoeuqyftvxwfobicuehd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949486.7217233-493-125235074874130/AnsiballZ_stat.py'
Oct 08 18:51:27 compute-0 sudo[17727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:27 compute-0 python3.9[17729]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:51:27 compute-0 sudo[17727]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:27 compute-0 sudo[17850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joazcikcturroiraxezhzhnsvqtlchev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949486.7217233-493-125235074874130/AnsiballZ_copy.py'
Oct 08 18:51:27 compute-0 sudo[17850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:27 compute-0 python3.9[17852]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949486.7217233-493-125235074874130/.source.json _original_basename=.80d_off9 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:51:27 compute-0 sudo[17850]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:28 compute-0 sudo[18002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txlwrcbevozvxmceyoljuuarfocmgbxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949488.0262165-508-209226138438866/AnsiballZ_file.py'
Oct 08 18:51:28 compute-0 sudo[18002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:28 compute-0 python3.9[18004]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:51:28 compute-0 sudo[18002]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:29 compute-0 sudo[18154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjkidayadmdffujswowegwhskcsvobym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949488.7761495-516-218682146242199/AnsiballZ_stat.py'
Oct 08 18:51:29 compute-0 sudo[18154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:29 compute-0 sudo[18154]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:29 compute-0 sudo[18277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwfcyohqpulysjpdkdmpvqdjdqjsjzgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949488.7761495-516-218682146242199/AnsiballZ_copy.py'
Oct 08 18:51:29 compute-0 sudo[18277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:29 compute-0 sudo[18277]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:30 compute-0 sudo[18429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iziliufgkeymczssaceyhfjldqkdxbpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949490.239596-533-137971054246621/AnsiballZ_container_config_data.py'
Oct 08 18:51:30 compute-0 sudo[18429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:30 compute-0 python3.9[18431]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Oct 08 18:51:30 compute-0 sudo[18429]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:31 compute-0 sudo[18581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iccgwjqaghyhvdpqywdvsdjqzwbggrrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949491.1750772-542-76683175476762/AnsiballZ_container_config_hash.py'
Oct 08 18:51:31 compute-0 sudo[18581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:31 compute-0 python3.9[18583]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 08 18:51:31 compute-0 sudo[18581]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:32 compute-0 sudo[18733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csowqmqrylujrytemejgerzlhrbwfrsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949492.174064-551-153575526430826/AnsiballZ_podman_container_info.py'
Oct 08 18:51:32 compute-0 sudo[18733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:32 compute-0 python3.9[18735]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 08 18:51:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat3189792475-merged.mount: Deactivated successfully.
Oct 08 18:51:33 compute-0 kernel: evm: overlay not supported
Oct 08 18:51:33 compute-0 podman[18736]: 2025-10-08 18:51:33.16962068 +0000 UTC m=+0.204498082 system refresh
Oct 08 18:51:33 compute-0 sudo[18733]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 08 18:51:34 compute-0 sudo[18902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdjjceiyunarfcrndqratdemcxrcjxpk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759949493.6509829-564-166542739221845/AnsiballZ_edpm_container_manage.py'
Oct 08 18:51:34 compute-0 sudo[18902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:34 compute-0 python3[18904]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 08 18:51:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 08 18:51:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 08 18:51:34 compute-0 podman[18940]: 2025-10-08 18:51:34.674681157 +0000 UTC m=+0.026348267 image pull 70c92fb64e1eda6ef063d34e60e9a541e44edbaa51e757e8304331202c76a3a7 quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857
Oct 08 18:51:34 compute-0 podman[18940]: 2025-10-08 18:51:34.817700723 +0000 UTC m=+0.169367783 container create 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Oct 08 18:51:34 compute-0 python3[18904]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857
Oct 08 18:51:34 compute-0 sudo[18902]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:35 compute-0 sudo[19132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeppcqaeasdyamkbhfdilcsuwdenorqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949495.1823523-572-115160001004988/AnsiballZ_stat.py'
Oct 08 18:51:35 compute-0 sudo[19132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:35 compute-0 python3.9[19134]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:51:35 compute-0 sudo[19132]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:36 compute-0 sudo[19286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvfjiovajbieacknxlmqqbocdkcbnscy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949495.98938-581-132488540058135/AnsiballZ_file.py'
Oct 08 18:51:36 compute-0 sudo[19286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:36 compute-0 python3.9[19288]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:51:36 compute-0 sudo[19286]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:36 compute-0 sudo[19362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftlfdxbeelxfhmzjppfxrjydlhvvbscu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949495.98938-581-132488540058135/AnsiballZ_stat.py'
Oct 08 18:51:36 compute-0 sudo[19362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:37 compute-0 python3.9[19364]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:51:37 compute-0 sudo[19362]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:37 compute-0 sudo[19513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkdgmxvghzispsohsylqlbduthsiuvqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949497.1920602-581-162758505078983/AnsiballZ_copy.py'
Oct 08 18:51:37 compute-0 sudo[19513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:37 compute-0 python3.9[19515]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759949497.1920602-581-162758505078983/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:51:37 compute-0 sudo[19513]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:38 compute-0 sudo[19589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozgjqlxkaieuzuyzenvronmxomcawrum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949497.1920602-581-162758505078983/AnsiballZ_systemd.py'
Oct 08 18:51:38 compute-0 sudo[19589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:38 compute-0 python3.9[19591]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 08 18:51:38 compute-0 systemd[1]: Reloading.
Oct 08 18:51:38 compute-0 systemd-rc-local-generator[19614]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:51:38 compute-0 systemd-sysv-generator[19620]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:51:38 compute-0 sudo[19589]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:39 compute-0 sudo[19700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjfbmkjeluzthapiyqfvzyktxviqhklh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949497.1920602-581-162758505078983/AnsiballZ_systemd.py'
Oct 08 18:51:39 compute-0 sudo[19700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:39 compute-0 python3.9[19702]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:51:39 compute-0 systemd[1]: Reloading.
Oct 08 18:51:39 compute-0 systemd-rc-local-generator[19727]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:51:39 compute-0 systemd-sysv-generator[19733]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:51:39 compute-0 systemd[1]: Starting ovn_controller container...
Oct 08 18:51:40 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Oct 08 18:51:40 compute-0 systemd[1]: Started libcrun container.
Oct 08 18:51:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f3dedad02a36764af72efb60d6a2f065ff7fe559c6f3cfd8cec593e78849fb5/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 08 18:51:40 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59.
Oct 08 18:51:40 compute-0 podman[19743]: 2025-10-08 18:51:40.374599402 +0000 UTC m=+0.398483181 container init 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 08 18:51:40 compute-0 podman[19743]: 2025-10-08 18:51:40.414980941 +0000 UTC m=+0.438864720 container start 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 08 18:51:40 compute-0 edpm-start-podman-container[19743]: ovn_controller
Oct 08 18:51:40 compute-0 ovn_controller[19759]: + sudo -E kolla_set_configs
Oct 08 18:51:40 compute-0 edpm-start-podman-container[19742]: Creating additional drop-in dependency for "ovn_controller" (4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59)
Oct 08 18:51:40 compute-0 systemd[1]: Reloading.
Oct 08 18:51:40 compute-0 podman[19764]: 2025-10-08 18:51:40.577116265 +0000 UTC m=+0.151030146 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct 08 18:51:40 compute-0 systemd-rc-local-generator[19830]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:51:40 compute-0 systemd-sysv-generator[19835]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:51:40 compute-0 systemd[1]: 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59-1f119a9749d9ea73.service: Main process exited, code=exited, status=1/FAILURE
Oct 08 18:51:40 compute-0 systemd[1]: 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59-1f119a9749d9ea73.service: Failed with result 'exit-code'.
Oct 08 18:51:40 compute-0 systemd[1]: Started ovn_controller container.
Oct 08 18:51:40 compute-0 systemd[1]: Created slice User Slice of UID 0.
Oct 08 18:51:40 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 08 18:51:40 compute-0 sudo[19700]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:40 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 08 18:51:40 compute-0 systemd[1]: Starting User Manager for UID 0...
Oct 08 18:51:40 compute-0 systemd[1314]: Starting Mark boot as successful...
Oct 08 18:51:40 compute-0 systemd[19844]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Oct 08 18:51:40 compute-0 systemd[1314]: Finished Mark boot as successful.
Oct 08 18:51:40 compute-0 systemd[19844]: Queued start job for default target Main User Target.
Oct 08 18:51:40 compute-0 systemd[19844]: Created slice User Application Slice.
Oct 08 18:51:40 compute-0 systemd[19844]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 08 18:51:40 compute-0 systemd[19844]: Started Daily Cleanup of User's Temporary Directories.
Oct 08 18:51:40 compute-0 systemd[19844]: Reached target Paths.
Oct 08 18:51:40 compute-0 systemd[19844]: Reached target Timers.
Oct 08 18:51:40 compute-0 systemd[19844]: Starting D-Bus User Message Bus Socket...
Oct 08 18:51:40 compute-0 systemd[19844]: Starting Create User's Volatile Files and Directories...
Oct 08 18:51:40 compute-0 systemd[19844]: Finished Create User's Volatile Files and Directories.
Oct 08 18:51:40 compute-0 systemd[19844]: Listening on D-Bus User Message Bus Socket.
Oct 08 18:51:40 compute-0 systemd[19844]: Reached target Sockets.
Oct 08 18:51:40 compute-0 systemd[19844]: Reached target Basic System.
Oct 08 18:51:40 compute-0 systemd[19844]: Reached target Main User Target.
Oct 08 18:51:40 compute-0 systemd[19844]: Startup finished in 99ms.
Oct 08 18:51:40 compute-0 systemd[1]: Started User Manager for UID 0.
Oct 08 18:51:40 compute-0 systemd[1]: Started Session c1 of User root.
Oct 08 18:51:41 compute-0 ovn_controller[19759]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 08 18:51:41 compute-0 ovn_controller[19759]: INFO:__main__:Validating config file
Oct 08 18:51:41 compute-0 ovn_controller[19759]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 08 18:51:41 compute-0 ovn_controller[19759]: INFO:__main__:Writing out command to execute
Oct 08 18:51:41 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Oct 08 18:51:41 compute-0 ovn_controller[19759]: ++ cat /run_command
Oct 08 18:51:41 compute-0 ovn_controller[19759]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct 08 18:51:41 compute-0 ovn_controller[19759]: + ARGS=
Oct 08 18:51:41 compute-0 ovn_controller[19759]: + sudo kolla_copy_cacerts
Oct 08 18:51:41 compute-0 systemd[1]: Started Session c2 of User root.
Oct 08 18:51:41 compute-0 sudo[20013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyhyusshsmycmjcyrpgdgqgsfoeiyksu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949501.003208-609-196965230526457/AnsiballZ_command.py'
Oct 08 18:51:41 compute-0 sudo[20013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:41 compute-0 ovn_controller[19759]: + [[ ! -n '' ]]
Oct 08 18:51:41 compute-0 ovn_controller[19759]: + . kolla_extend_start
Oct 08 18:51:41 compute-0 ovn_controller[19759]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct 08 18:51:41 compute-0 ovn_controller[19759]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Oct 08 18:51:41 compute-0 ovn_controller[19759]: + umask 0022
Oct 08 18:51:41 compute-0 ovn_controller[19759]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Oct 08 18:51:41 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Oct 08 18:51:41 compute-0 NetworkManager[1035]: <info>  [1759949501.3924] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Oct 08 18:51:41 compute-0 NetworkManager[1035]: <info>  [1759949501.3933] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct 08 18:51:41 compute-0 NetworkManager[1035]: <info>  [1759949501.3949] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Oct 08 18:51:41 compute-0 kernel: br-int: entered promiscuous mode
Oct 08 18:51:41 compute-0 NetworkManager[1035]: <info>  [1759949501.3958] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Oct 08 18:51:41 compute-0 NetworkManager[1035]: <info>  [1759949501.3973] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00014|main|INFO|OVS feature set changed, force recompute.
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00020|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00021|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00022|main|INFO|OVS feature set changed, force recompute.
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 08 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 08 18:51:41 compute-0 NetworkManager[1035]: <info>  [1759949501.4166] manager: (ovn-98a9aa-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Oct 08 18:51:41 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Oct 08 18:51:41 compute-0 NetworkManager[1035]: <info>  [1759949501.4395] device (genev_sys_6081): carrier: link connected
Oct 08 18:51:41 compute-0 NetworkManager[1035]: <info>  [1759949501.4398] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Oct 08 18:51:41 compute-0 systemd-udevd[20027]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 18:51:41 compute-0 systemd-udevd[20030]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 18:51:41 compute-0 python3.9[20016]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:51:41 compute-0 ovs-vsctl[20033]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Oct 08 18:51:41 compute-0 sudo[20013]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:42 compute-0 sudo[20183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdoulkshyasrnkgotfcoehsfffrtmtaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949501.7372274-617-280626450698742/AnsiballZ_command.py'
Oct 08 18:51:42 compute-0 sudo[20183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:42 compute-0 python3.9[20185]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:51:42 compute-0 ovs-vsctl[20187]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Oct 08 18:51:42 compute-0 sudo[20183]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:42 compute-0 sudo[20338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnwzjfljmvndekcgzuuuossbjfcipbig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949502.7346046-631-192238468325436/AnsiballZ_command.py'
Oct 08 18:51:42 compute-0 sudo[20338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:43 compute-0 python3.9[20340]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:51:43 compute-0 ovs-vsctl[20341]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Oct 08 18:51:43 compute-0 sudo[20338]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:43 compute-0 sshd-session[9256]: Connection closed by 192.168.122.30 port 34034
Oct 08 18:51:43 compute-0 sshd-session[9253]: pam_unix(sshd:session): session closed for user zuul
Oct 08 18:51:43 compute-0 systemd[1]: session-4.scope: Deactivated successfully.
Oct 08 18:51:43 compute-0 systemd[1]: session-4.scope: Consumed 50.988s CPU time.
Oct 08 18:51:43 compute-0 systemd-logind[844]: Session 4 logged out. Waiting for processes to exit.
Oct 08 18:51:43 compute-0 systemd-logind[844]: Removed session 4.
Oct 08 18:51:49 compute-0 sshd-session[20366]: Accepted publickey for zuul from 192.168.122.30 port 32968 ssh2: ECDSA SHA256:i+73Mx2Y/ukt1b+huf+9w+ftZalnyybbDU6glTR0JfU
Oct 08 18:51:49 compute-0 systemd-logind[844]: New session 6 of user zuul.
Oct 08 18:51:49 compute-0 systemd[1]: Started Session 6 of User zuul.
Oct 08 18:51:49 compute-0 sshd-session[20366]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 08 18:51:50 compute-0 python3.9[20519]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 08 18:51:51 compute-0 systemd[1]: Stopping User Manager for UID 0...
Oct 08 18:51:51 compute-0 systemd[19844]: Activating special unit Exit the Session...
Oct 08 18:51:51 compute-0 systemd[19844]: Stopped target Main User Target.
Oct 08 18:51:51 compute-0 systemd[19844]: Stopped target Basic System.
Oct 08 18:51:51 compute-0 systemd[19844]: Stopped target Paths.
Oct 08 18:51:51 compute-0 systemd[19844]: Stopped target Sockets.
Oct 08 18:51:51 compute-0 systemd[19844]: Stopped target Timers.
Oct 08 18:51:51 compute-0 systemd[19844]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 08 18:51:51 compute-0 systemd[19844]: Closed D-Bus User Message Bus Socket.
Oct 08 18:51:51 compute-0 systemd[19844]: Stopped Create User's Volatile Files and Directories.
Oct 08 18:51:51 compute-0 systemd[19844]: Removed slice User Application Slice.
Oct 08 18:51:51 compute-0 systemd[19844]: Reached target Shutdown.
Oct 08 18:51:51 compute-0 systemd[19844]: Finished Exit the Session.
Oct 08 18:51:51 compute-0 systemd[19844]: Reached target Exit the Session.
Oct 08 18:51:51 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Oct 08 18:51:51 compute-0 systemd[1]: Stopped User Manager for UID 0.
Oct 08 18:51:51 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 08 18:51:51 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 08 18:51:51 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 08 18:51:51 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 08 18:51:51 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Oct 08 18:51:51 compute-0 sudo[20675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vecpnicqwhbcbwhyklvpltlajqmalcew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949511.1032286-34-248435530593454/AnsiballZ_file.py'
Oct 08 18:51:51 compute-0 sudo[20675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:51 compute-0 python3.9[20677]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:51:51 compute-0 sudo[20675]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:52 compute-0 sudo[20827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwvvkzjhkslalqinxzpnevgxyhisvctr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949512.0840647-34-225034755440430/AnsiballZ_file.py'
Oct 08 18:51:52 compute-0 sudo[20827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:52 compute-0 python3.9[20829]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:51:52 compute-0 sudo[20827]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:53 compute-0 sudo[20979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzmoliofymufutpsudcqzpehkfjmebkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949512.8142433-34-122913945661498/AnsiballZ_file.py'
Oct 08 18:51:53 compute-0 sudo[20979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:53 compute-0 python3.9[20981]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:51:53 compute-0 sudo[20979]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:53 compute-0 sudo[21131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wedknrjqqcysxmhzzqmzxlsyumxfpajl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949513.5480168-34-75048010410410/AnsiballZ_file.py'
Oct 08 18:51:53 compute-0 sudo[21131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:54 compute-0 python3.9[21133]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:51:54 compute-0 sudo[21131]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:54 compute-0 sudo[21283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqvwudmjvdiaumnvrjmcejfdeturyoor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949514.3282342-34-253558270053649/AnsiballZ_file.py'
Oct 08 18:51:54 compute-0 sudo[21283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:54 compute-0 python3.9[21285]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:51:54 compute-0 sudo[21283]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:55 compute-0 python3.9[21435]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 08 18:51:56 compute-0 sudo[21585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnrlehuhrwbtuhrjbnwkcgyuajommxyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949515.968384-78-225665377883789/AnsiballZ_seboolean.py'
Oct 08 18:51:56 compute-0 sudo[21585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:51:56 compute-0 python3.9[21587]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 08 18:51:57 compute-0 sudo[21585]: pam_unix(sudo:session): session closed for user root
Oct 08 18:51:58 compute-0 python3.9[21737]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:51:59 compute-0 python3.9[21858]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949517.6913862-86-53439925079176/.source follow=False _original_basename=haproxy.j2 checksum=4bca74f6ee0b6450624d22997e2f90c414d58b44 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:51:59 compute-0 python3.9[22009]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:52:00 compute-0 python3.9[22130]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949519.285496-101-219849640860557/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:52:01 compute-0 sudo[22280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkwsfsczesxudqtzhkywflawntguynpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949520.6715593-118-83076232753597/AnsiballZ_setup.py'
Oct 08 18:52:01 compute-0 sudo[22280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:01 compute-0 python3.9[22282]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 08 18:52:01 compute-0 sudo[22280]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:02 compute-0 sudo[22364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmkbmmwdqhbmnoddgrplpvpgwwsrrfes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949520.6715593-118-83076232753597/AnsiballZ_dnf.py'
Oct 08 18:52:02 compute-0 sudo[22364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:02 compute-0 python3.9[22366]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 08 18:52:03 compute-0 sudo[22364]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:04 compute-0 sudo[22517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsvpcvdaekjldzeyjfzpamfegmkqtmge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949523.643177-130-64041622306276/AnsiballZ_systemd.py'
Oct 08 18:52:04 compute-0 sudo[22517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:04 compute-0 python3.9[22519]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 08 18:52:04 compute-0 sudo[22517]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:05 compute-0 python3.9[22672]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:52:06 compute-0 python3.9[22793]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949524.8942306-138-171508789279123/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:52:06 compute-0 python3.9[22943]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:52:07 compute-0 python3.9[23064]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949526.2239397-138-260886726275346/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:52:08 compute-0 python3.9[23214]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:52:09 compute-0 python3.9[23335]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949528.1397173-182-155786115814026/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:52:10 compute-0 python3.9[23485]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:52:10 compute-0 python3.9[23606]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949529.519808-182-110034891505854/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:52:11 compute-0 ovn_controller[19759]: 2025-10-08T18:52:11Z|00025|memory|INFO|16128 kB peak resident set size after 29.9 seconds
Oct 08 18:52:11 compute-0 ovn_controller[19759]: 2025-10-08T18:52:11Z|00026|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:471 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Oct 08 18:52:11 compute-0 podman[23730]: 2025-10-08 18:52:11.325229137 +0000 UTC m=+0.106607422 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 08 18:52:11 compute-0 python3.9[23769]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:52:12 compute-0 sudo[23934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qorbyrjgmwiyncavlsmainbwbauzbfbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949531.6880176-220-31363619248571/AnsiballZ_file.py'
Oct 08 18:52:12 compute-0 sudo[23934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:12 compute-0 python3.9[23936]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:52:12 compute-0 sudo[23934]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:12 compute-0 sudo[24086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjosfvhycdsqxkvcjfpwvwmfeiarjtcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949532.470732-228-231580696117177/AnsiballZ_stat.py'
Oct 08 18:52:12 compute-0 sudo[24086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:13 compute-0 python3.9[24088]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:52:13 compute-0 sudo[24086]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:13 compute-0 sudo[24164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjrifriikvnaermrbwcxvgjolvaqrqtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949532.470732-228-231580696117177/AnsiballZ_file.py'
Oct 08 18:52:13 compute-0 sudo[24164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:13 compute-0 python3.9[24166]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:52:13 compute-0 sudo[24164]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:14 compute-0 sudo[24316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nirnwumczoauohygszwsibjacscgodxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949533.823584-228-55586769579003/AnsiballZ_stat.py'
Oct 08 18:52:14 compute-0 sudo[24316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:14 compute-0 python3.9[24318]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:52:14 compute-0 sudo[24316]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:14 compute-0 sudo[24394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcnuwlfjipzxcarjuwcjehymhywhbseo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949533.823584-228-55586769579003/AnsiballZ_file.py'
Oct 08 18:52:14 compute-0 sudo[24394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:14 compute-0 python3.9[24396]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:52:14 compute-0 sudo[24394]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:15 compute-0 sudo[24546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vniqzyljoaolofoqwdkjisxnrrypzoae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949535.1619375-251-61203272939735/AnsiballZ_file.py'
Oct 08 18:52:15 compute-0 sudo[24546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:15 compute-0 python3.9[24548]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:52:15 compute-0 sudo[24546]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:16 compute-0 sudo[24698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjxyapiipkkdaulqkadbdongvyeftiue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949535.919674-259-11018554360538/AnsiballZ_stat.py'
Oct 08 18:52:16 compute-0 sudo[24698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:16 compute-0 python3.9[24700]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:52:16 compute-0 sudo[24698]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:16 compute-0 sudo[24778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upwuevsovweiqydzjlgsaocseqnlwqnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949535.919674-259-11018554360538/AnsiballZ_file.py'
Oct 08 18:52:16 compute-0 sudo[24778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:17 compute-0 python3.9[24780]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:52:17 compute-0 sudo[24778]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:17 compute-0 unix_chkpwd[24904]: password check failed for user (root)
Oct 08 18:52:17 compute-0 sshd-session[24758]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.176  user=root
Oct 08 18:52:17 compute-0 sudo[24931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvuvknaqsaudofmiluchsrxdsrllkjny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949537.283585-271-64039546185516/AnsiballZ_stat.py'
Oct 08 18:52:17 compute-0 sudo[24931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:17 compute-0 python3.9[24933]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:52:17 compute-0 sudo[24931]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:18 compute-0 sudo[25009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttmmroadxtxpqjsttkdxxfojpgteopjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949537.283585-271-64039546185516/AnsiballZ_file.py'
Oct 08 18:52:18 compute-0 sudo[25009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:18 compute-0 python3.9[25011]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:52:18 compute-0 sudo[25009]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:18 compute-0 sudo[25161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbfzqjshcchzdailpahrxynccjawtvop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949538.582925-283-8691343984921/AnsiballZ_systemd.py'
Oct 08 18:52:18 compute-0 sudo[25161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:19 compute-0 python3.9[25163]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:52:19 compute-0 systemd[1]: Reloading.
Oct 08 18:52:19 compute-0 systemd-sysv-generator[25195]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:52:19 compute-0 systemd-rc-local-generator[25190]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:52:19 compute-0 sshd-session[24758]: Failed password for root from 80.94.93.176 port 46570 ssh2
Oct 08 18:52:19 compute-0 sudo[25161]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:20 compute-0 unix_chkpwd[25334]: password check failed for user (root)
Oct 08 18:52:20 compute-0 sudo[25352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eszrgvfeetkpkvyjvanjurqasovuvqkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949539.7211401-291-168592645461986/AnsiballZ_stat.py'
Oct 08 18:52:20 compute-0 sudo[25352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:20 compute-0 python3.9[25354]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:52:20 compute-0 sudo[25352]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:20 compute-0 sudo[25430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqrxwtgnwufzukkwvjcvmrjvshcchzku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949539.7211401-291-168592645461986/AnsiballZ_file.py'
Oct 08 18:52:20 compute-0 sudo[25430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:20 compute-0 python3.9[25432]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:52:20 compute-0 sudo[25430]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:21 compute-0 sudo[25582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rihwuwbydpjdsnfxavngklnljkylpaqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949541.022887-303-88493352544037/AnsiballZ_stat.py'
Oct 08 18:52:21 compute-0 sudo[25582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:21 compute-0 python3.9[25584]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:52:21 compute-0 sudo[25582]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:21 compute-0 sudo[25660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwzcnwuyebdyaxeihtkuuvxgnbcgzukl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949541.022887-303-88493352544037/AnsiballZ_file.py'
Oct 08 18:52:21 compute-0 sudo[25660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:21 compute-0 python3.9[25662]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:52:22 compute-0 sudo[25660]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:22 compute-0 sshd-session[24758]: Failed password for root from 80.94.93.176 port 46570 ssh2
Oct 08 18:52:22 compute-0 unix_chkpwd[25803]: password check failed for user (root)
Oct 08 18:52:22 compute-0 sudo[25813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pemriqntaynendttygownqauwtjldzqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949542.1754842-315-133958407495352/AnsiballZ_systemd.py'
Oct 08 18:52:22 compute-0 sudo[25813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:22 compute-0 python3.9[25815]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:52:22 compute-0 systemd[1]: Reloading.
Oct 08 18:52:23 compute-0 systemd-rc-local-generator[25838]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:52:23 compute-0 systemd-sysv-generator[25841]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:52:23 compute-0 systemd[1]: Starting Create netns directory...
Oct 08 18:52:23 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 08 18:52:23 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 08 18:52:23 compute-0 systemd[1]: Finished Create netns directory.
Oct 08 18:52:23 compute-0 sudo[25813]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:23 compute-0 sudo[26006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcuyumphnjamhbheuyidhzlfjbuejlmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949543.5409226-325-171852148359880/AnsiballZ_file.py'
Oct 08 18:52:23 compute-0 sudo[26006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:24 compute-0 python3.9[26008]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:52:24 compute-0 sshd-session[24758]: Failed password for root from 80.94.93.176 port 46570 ssh2
Oct 08 18:52:24 compute-0 sudo[26006]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:24 compute-0 sudo[26158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stkahhyuhqgszgkrxohmwkekyshezndg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949544.3555024-333-262336270419425/AnsiballZ_stat.py'
Oct 08 18:52:24 compute-0 sudo[26158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:24 compute-0 sshd-session[24758]: Received disconnect from 80.94.93.176 port 46570:11:  [preauth]
Oct 08 18:52:24 compute-0 sshd-session[24758]: Disconnected from authenticating user root 80.94.93.176 port 46570 [preauth]
Oct 08 18:52:24 compute-0 sshd-session[24758]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.176  user=root
Oct 08 18:52:25 compute-0 python3.9[26160]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:52:25 compute-0 sudo[26158]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:25 compute-0 sudo[26283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftwhugbdftkgykvyuyftisbjdwvzglyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949544.3555024-333-262336270419425/AnsiballZ_copy.py'
Oct 08 18:52:25 compute-0 sudo[26283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:25 compute-0 python3.9[26285]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949544.3555024-333-262336270419425/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:52:25 compute-0 sudo[26283]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:26 compute-0 unix_chkpwd[26385]: password check failed for user (root)
Oct 08 18:52:26 compute-0 sshd-session[26166]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.176  user=root
Oct 08 18:52:26 compute-0 sudo[26436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plfkaerpkarsmajxhtxhrnnddzdcsjml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949546.0014865-350-205926563146861/AnsiballZ_file.py'
Oct 08 18:52:26 compute-0 sudo[26436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:26 compute-0 python3.9[26438]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:52:26 compute-0 sudo[26436]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:27 compute-0 sudo[26588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmwkzgtmqwrwczlcniydpgpnkubesfex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949546.7347236-358-278074340770702/AnsiballZ_stat.py'
Oct 08 18:52:27 compute-0 sudo[26588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:27 compute-0 python3.9[26590]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:52:27 compute-0 sudo[26588]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:27 compute-0 sudo[26711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpmgynznxbxbhnxyzdcvhedpguweiaxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949546.7347236-358-278074340770702/AnsiballZ_copy.py'
Oct 08 18:52:27 compute-0 sudo[26711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:27 compute-0 python3.9[26713]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949546.7347236-358-278074340770702/.source.json _original_basename=.vi1hzfr1 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:52:27 compute-0 sudo[26711]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:28 compute-0 sshd-session[26166]: Failed password for root from 80.94.93.176 port 62994 ssh2
Oct 08 18:52:28 compute-0 sudo[26863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulkyxfgsmpsulsiulmvfhsydoecobwaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949548.121948-373-147317570120270/AnsiballZ_file.py'
Oct 08 18:52:28 compute-0 sudo[26863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:28 compute-0 unix_chkpwd[26866]: password check failed for user (root)
Oct 08 18:52:28 compute-0 python3.9[26865]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:52:28 compute-0 sudo[26863]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:29 compute-0 sudo[27016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnyaoweejfsdkhchljahivufalwldiro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949549.068485-381-94836408553405/AnsiballZ_stat.py'
Oct 08 18:52:29 compute-0 sudo[27016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:29 compute-0 sudo[27016]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:30 compute-0 sudo[27139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfgzksscpnpawyhoyqwkhlszohscqapv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949549.068485-381-94836408553405/AnsiballZ_copy.py'
Oct 08 18:52:30 compute-0 sudo[27139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:30 compute-0 sudo[27139]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:30 compute-0 sshd-session[26166]: Failed password for root from 80.94.93.176 port 62994 ssh2
Oct 08 18:52:31 compute-0 unix_chkpwd[27291]: password check failed for user (root)
Oct 08 18:52:31 compute-0 sudo[27292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptaxzlwtxyrgpahlgdscmozpgtybkmro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949550.6661508-398-50725265508795/AnsiballZ_container_config_data.py'
Oct 08 18:52:31 compute-0 sudo[27292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:31 compute-0 python3.9[27294]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Oct 08 18:52:31 compute-0 sudo[27292]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:32 compute-0 sudo[27444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixyxtdgrkiiuqpkoxitpwbjabbroicmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949551.6087306-407-93818331096060/AnsiballZ_container_config_hash.py'
Oct 08 18:52:32 compute-0 sudo[27444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:32 compute-0 python3.9[27446]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 08 18:52:32 compute-0 sudo[27444]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:32 compute-0 sudo[27596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnyncosvpewjmyzvvjubizyevwbthikn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949552.532736-416-204270481111592/AnsiballZ_podman_container_info.py'
Oct 08 18:52:32 compute-0 sudo[27596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:33 compute-0 sshd-session[26166]: Failed password for root from 80.94.93.176 port 62994 ssh2
Oct 08 18:52:33 compute-0 python3.9[27598]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 08 18:52:33 compute-0 sudo[27596]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:33 compute-0 sshd-session[26166]: Received disconnect from 80.94.93.176 port 62994:11:  [preauth]
Oct 08 18:52:33 compute-0 sshd-session[26166]: Disconnected from authenticating user root 80.94.93.176 port 62994 [preauth]
Oct 08 18:52:33 compute-0 sshd-session[26166]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.176  user=root
Oct 08 18:52:34 compute-0 sudo[27776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyjvvqcnpvgwixlctsystvcrctdwquut ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759949553.8596678-429-228027033792512/AnsiballZ_edpm_container_manage.py'
Oct 08 18:52:34 compute-0 sudo[27776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:34 compute-0 unix_chkpwd[27779]: password check failed for user (root)
Oct 08 18:52:34 compute-0 sshd-session[27649]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.176  user=root
Oct 08 18:52:34 compute-0 python3[27778]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 08 18:52:34 compute-0 podman[27817]: 2025-10-08 18:52:34.899525688 +0000 UTC m=+0.038155287 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 08 18:52:35 compute-0 podman[27817]: 2025-10-08 18:52:35.168556674 +0000 UTC m=+0.307186203 container create 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 08 18:52:35 compute-0 python3[27778]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 08 18:52:35 compute-0 sudo[27776]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:35 compute-0 sudo[28007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akeelpxjqcdvqtfoozppggthulkyymnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949555.53021-437-220159028344454/AnsiballZ_stat.py'
Oct 08 18:52:35 compute-0 sudo[28007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:36 compute-0 sshd-session[27649]: Failed password for root from 80.94.93.176 port 62996 ssh2
Oct 08 18:52:36 compute-0 python3.9[28009]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:52:36 compute-0 sudo[28007]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:36 compute-0 sudo[28161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chyekqrdiishalzrpmvwsuenaxfucbog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949556.3921125-446-71068897857638/AnsiballZ_file.py'
Oct 08 18:52:36 compute-0 sudo[28161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:36 compute-0 unix_chkpwd[28164]: password check failed for user (root)
Oct 08 18:52:36 compute-0 python3.9[28163]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:52:36 compute-0 sudo[28161]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:37 compute-0 sudo[28238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhklfhohxrihhacsekctwksfhogryrkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949556.3921125-446-71068897857638/AnsiballZ_stat.py'
Oct 08 18:52:37 compute-0 sudo[28238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:37 compute-0 python3.9[28240]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:52:37 compute-0 sudo[28238]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:38 compute-0 sudo[28389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyeexkrnxcnjstiqzagjrluzlizgxben ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949557.613259-446-219813364589479/AnsiballZ_copy.py'
Oct 08 18:52:38 compute-0 sudo[28389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:38 compute-0 python3.9[28391]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759949557.613259-446-219813364589479/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:52:38 compute-0 sudo[28389]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:38 compute-0 sudo[28465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpehucbfovumazlhirnlnhjymooksssn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949557.613259-446-219813364589479/AnsiballZ_systemd.py'
Oct 08 18:52:38 compute-0 sudo[28465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:39 compute-0 python3.9[28467]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 08 18:52:39 compute-0 systemd[1]: Reloading.
Oct 08 18:52:39 compute-0 systemd-rc-local-generator[28490]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:52:39 compute-0 systemd-sysv-generator[28496]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:52:39 compute-0 sshd-session[27649]: Failed password for root from 80.94.93.176 port 62996 ssh2
Oct 08 18:52:39 compute-0 sudo[28465]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:39 compute-0 unix_chkpwd[28511]: password check failed for user (root)
Oct 08 18:52:39 compute-0 sudo[28577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahfizswanchrjuxpyxyjhmkjwcirfavw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949557.613259-446-219813364589479/AnsiballZ_systemd.py'
Oct 08 18:52:39 compute-0 sudo[28577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:40 compute-0 python3.9[28579]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:52:40 compute-0 systemd[1]: Reloading.
Oct 08 18:52:40 compute-0 systemd-sysv-generator[28614]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:52:40 compute-0 systemd-rc-local-generator[28610]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:52:40 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Oct 08 18:52:40 compute-0 systemd[1]: Started libcrun container.
Oct 08 18:52:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62ed697a92c2e27de8b9b8411fab5fe0db1b62146968c509eda8ca855c6aea8b/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct 08 18:52:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62ed697a92c2e27de8b9b8411fab5fe0db1b62146968c509eda8ca855c6aea8b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 08 18:52:40 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a.
Oct 08 18:52:40 compute-0 podman[28621]: 2025-10-08 18:52:40.626118484 +0000 UTC m=+0.268880153 container init 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: + sudo -E kolla_set_configs
Oct 08 18:52:40 compute-0 podman[28621]: 2025-10-08 18:52:40.658825273 +0000 UTC m=+0.301586852 container start 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct 08 18:52:40 compute-0 edpm-start-podman-container[28621]: ovn_metadata_agent
Oct 08 18:52:40 compute-0 podman[28642]: 2025-10-08 18:52:40.737692508 +0000 UTC m=+0.064666118 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 08 18:52:40 compute-0 edpm-start-podman-container[28620]: Creating additional drop-in dependency for "ovn_metadata_agent" (80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a)
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Validating config file
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Copying service configuration files
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Writing out command to execute
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Setting permission for /var/lib/neutron
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: ++ cat /run_command
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: + CMD=neutron-ovn-metadata-agent
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: + ARGS=
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: + sudo kolla_copy_cacerts
Oct 08 18:52:40 compute-0 systemd[1]: Reloading.
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: + [[ ! -n '' ]]
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: + . kolla_extend_start
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: Running command: 'neutron-ovn-metadata-agent'
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: + umask 0022
Oct 08 18:52:40 compute-0 ovn_metadata_agent[28637]: + exec neutron-ovn-metadata-agent
Oct 08 18:52:40 compute-0 systemd-rc-local-generator[28717]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:52:40 compute-0 systemd-sysv-generator[28721]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:52:41 compute-0 systemd[1]: Started ovn_metadata_agent container.
Oct 08 18:52:41 compute-0 sudo[28577]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:41 compute-0 sshd-session[20369]: Connection closed by 192.168.122.30 port 32968
Oct 08 18:52:41 compute-0 sshd-session[20366]: pam_unix(sshd:session): session closed for user zuul
Oct 08 18:52:41 compute-0 systemd-logind[844]: Session 6 logged out. Waiting for processes to exit.
Oct 08 18:52:41 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Oct 08 18:52:41 compute-0 systemd[1]: session-6.scope: Consumed 40.272s CPU time.
Oct 08 18:52:41 compute-0 sshd-session[27649]: Failed password for root from 80.94.93.176 port 62996 ssh2
Oct 08 18:52:41 compute-0 systemd-logind[844]: Removed session 6.
Oct 08 18:52:41 compute-0 podman[28749]: 2025-10-08 18:52:41.603246714 +0000 UTC m=+0.142465711 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 18:52:41 compute-0 sshd-session[27649]: Received disconnect from 80.94.93.176 port 62996:11:  [preauth]
Oct 08 18:52:41 compute-0 sshd-session[27649]: Disconnected from authenticating user root 80.94.93.176 port 62996 [preauth]
Oct 08 18:52:41 compute-0 sshd-session[27649]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.176  user=root
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.165 28643 INFO neutron.common.config [-] Logging enabled!
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.166 28643 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.166 28643 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.166 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.166 28643 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.166 28643 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.169 28643 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.169 28643 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.169 28643 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.169 28643 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.169 28643 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.169 28643 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.169 28643 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.169 28643 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.169 28643 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.170 28643 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.170 28643 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.170 28643 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.170 28643 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.170 28643 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.170 28643 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.170 28643 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.170 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.172 28643 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.172 28643 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.172 28643 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.172 28643 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.172 28643 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.172 28643 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.172 28643 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.172 28643 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.172 28643 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.173 28643 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.173 28643 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.173 28643 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.173 28643 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.173 28643 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.173 28643 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.173 28643 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.173 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.173 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.174 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.174 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.174 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.174 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.174 28643 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.174 28643 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.174 28643 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.174 28643 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.174 28643 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.175 28643 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.175 28643 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.175 28643 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.175 28643 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.175 28643 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.175 28643 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.175 28643 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.175 28643 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.175 28643 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.177 28643 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.177 28643 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.177 28643 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.177 28643 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.177 28643 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.177 28643 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.177 28643 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.177 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.177 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.179 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.179 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.179 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.179 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.179 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.179 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.179 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.179 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.179 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.180 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.180 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.180 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.180 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.180 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.180 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.180 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.180 28643 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.184 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.184 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.184 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.184 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.184 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.184 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.184 28643 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.184 28643 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.184 28643 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.185 28643 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.185 28643 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.185 28643 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.185 28643 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.185 28643 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.185 28643 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.185 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.185 28643 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.185 28643 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.186 28643 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.186 28643 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.186 28643 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.186 28643 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.186 28643 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.186 28643 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.186 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.186 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.188 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.188 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.188 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.188 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.188 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.188 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.188 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.189 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.189 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.189 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.189 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.189 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.189 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.189 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.189 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.189 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.190 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.190 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.190 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.190 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.190 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.190 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.190 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.190 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.190 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.191 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.191 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.191 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.191 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.191 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.191 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.191 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.191 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.191 28643 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.192 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.192 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.192 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.192 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.192 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.192 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.192 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.192 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.192 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.194 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.194 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.194 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.194 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.194 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.194 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.194 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.194 28643 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.194 28643 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.195 28643 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.195 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.195 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.195 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.195 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.195 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.195 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.195 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.199 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.207 28643 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.208 28643 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.208 28643 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.208 28643 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.208 28643 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.219 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 47f81f7a-64d8-418a-a74c-b879bd6deb83 (UUID: 47f81f7a-64d8-418a-a74c-b879bd6deb83) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.243 28643 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.243 28643 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.243 28643 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.243 28643 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.246 28643 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.252 28643 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.256 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '47f81f7a-64d8-418a-a74c-b879bd6deb83'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], external_ids={}, name=47f81f7a-64d8-418a-a74c-b879bd6deb83, nb_cfg_timestamp=1759949509419, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.257 28643 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f9cb27900a0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.258 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.258 28643 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.258 28643 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.258 28643 INFO oslo_service.service [-] Starting 1 workers
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.262 28643 DEBUG oslo_service.service [-] Started child 28778 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.266 28643 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpk62ej7qg/privsep.sock']
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.267 28778 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-891676'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.303 28778 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.304 28778 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.304 28778 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.309 28778 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.317 28778 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.326 28778 INFO eventlet.wsgi.server [-] (28778) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Oct 08 18:52:44 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.992 28643 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.993 28643 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpk62ej7qg/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.876 28783 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.884 28783 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.887 28783 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.888 28783 INFO oslo.privsep.daemon [-] privsep daemon running as pid 28783
Oct 08 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.998 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[99dc8d27-7c6b-487f-915a-3e2d20899944]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.431 28783 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.431 28783 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.431 28783 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.898 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[3a3b2da1-2766-4d54-b42f-866a4d5ee3ca]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.901 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, column=external_ids, values=({'neutron:ovn-metadata-id': '848359ed-b94c-5960-a0fa-54c8b235d5a5'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.911 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.918 28643 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.919 28643 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.919 28643 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.919 28643 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.919 28643 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.919 28643 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.920 28643 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.920 28643 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.920 28643 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.920 28643 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.921 28643 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.921 28643 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.921 28643 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.921 28643 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.922 28643 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.922 28643 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.922 28643 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.923 28643 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.923 28643 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.923 28643 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.923 28643 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.923 28643 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.924 28643 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.924 28643 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.924 28643 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.925 28643 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.925 28643 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.925 28643 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.925 28643 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.926 28643 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.926 28643 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.926 28643 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.926 28643 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.927 28643 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.927 28643 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.927 28643 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.928 28643 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.928 28643 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.928 28643 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.929 28643 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.929 28643 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.929 28643 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.929 28643 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.930 28643 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.930 28643 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.930 28643 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.930 28643 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.931 28643 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.931 28643 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.931 28643 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.932 28643 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.932 28643 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.932 28643 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.932 28643 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.933 28643 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.933 28643 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.933 28643 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.934 28643 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.934 28643 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.934 28643 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.934 28643 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.935 28643 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.935 28643 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.935 28643 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.935 28643 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.935 28643 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.936 28643 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.936 28643 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.936 28643 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.936 28643 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.937 28643 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.937 28643 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.937 28643 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.937 28643 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.938 28643 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.938 28643 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.938 28643 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.938 28643 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.939 28643 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.939 28643 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.939 28643 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.939 28643 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.940 28643 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.940 28643 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.940 28643 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.940 28643 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.941 28643 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.941 28643 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.941 28643 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.941 28643 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.942 28643 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.942 28643 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.942 28643 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.942 28643 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.942 28643 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.943 28643 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.943 28643 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.943 28643 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.943 28643 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.944 28643 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.944 28643 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.944 28643 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.944 28643 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.944 28643 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.945 28643 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.945 28643 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.945 28643 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.945 28643 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.946 28643 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.946 28643 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.946 28643 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.947 28643 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.947 28643 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.947 28643 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.947 28643 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.947 28643 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.948 28643 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.948 28643 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.948 28643 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.949 28643 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.949 28643 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.949 28643 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.949 28643 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.950 28643 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.950 28643 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.950 28643 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.951 28643 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.951 28643 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.951 28643 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.951 28643 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.951 28643 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.952 28643 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.952 28643 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.952 28643 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.952 28643 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.953 28643 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.953 28643 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.953 28643 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.953 28643 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.954 28643 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.954 28643 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.954 28643 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.954 28643 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.955 28643 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.955 28643 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.955 28643 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.955 28643 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.955 28643 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.956 28643 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.956 28643 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.956 28643 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.956 28643 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.957 28643 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.957 28643 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.957 28643 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.957 28643 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.957 28643 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.958 28643 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.958 28643 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.958 28643 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.958 28643 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.958 28643 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.959 28643 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.959 28643 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.959 28643 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.959 28643 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.960 28643 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.960 28643 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.960 28643 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.960 28643 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.960 28643 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.961 28643 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.961 28643 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.961 28643 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.961 28643 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.962 28643 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.962 28643 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.962 28643 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.962 28643 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.962 28643 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.963 28643 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.963 28643 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.963 28643 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.963 28643 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.963 28643 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.963 28643 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.963 28643 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.964 28643 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.964 28643 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.964 28643 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.964 28643 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.964 28643 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.964 28643 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.964 28643 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.965 28643 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.965 28643 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.965 28643 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.965 28643 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.965 28643 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.965 28643 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.965 28643 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.966 28643 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.966 28643 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.966 28643 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.966 28643 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.966 28643 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.966 28643 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.966 28643 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.967 28643 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.967 28643 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.967 28643 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.967 28643 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.967 28643 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.967 28643 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.967 28643 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.967 28643 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.968 28643 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.968 28643 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.968 28643 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.968 28643 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.968 28643 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.968 28643 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.968 28643 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.969 28643 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.969 28643 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.969 28643 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.969 28643 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.969 28643 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.969 28643 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.969 28643 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.970 28643 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.970 28643 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.970 28643 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.970 28643 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.970 28643 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.970 28643 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.970 28643 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.971 28643 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.971 28643 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.971 28643 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.971 28643 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.971 28643 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.971 28643 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.971 28643 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.972 28643 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.972 28643 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.972 28643 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.972 28643 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.972 28643 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.972 28643 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.972 28643 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.973 28643 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.973 28643 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.973 28643 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.973 28643 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.973 28643 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.973 28643 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.974 28643 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.974 28643 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.974 28643 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.974 28643 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.974 28643 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.974 28643 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.974 28643 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.974 28643 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.975 28643 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.975 28643 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.975 28643 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.975 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.975 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.975 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.976 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.976 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.976 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.976 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.976 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.976 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.976 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.977 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.977 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.977 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.977 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.977 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.977 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.977 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.978 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.978 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.978 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.978 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.978 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.978 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.979 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.979 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.979 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.979 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.979 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.979 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.979 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.980 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.980 28643 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.980 28643 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.980 28643 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.980 28643 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.980 28643 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 08 18:52:46 compute-0 sshd-session[28788]: Accepted publickey for zuul from 192.168.122.30 port 36532 ssh2: ECDSA SHA256:i+73Mx2Y/ukt1b+huf+9w+ftZalnyybbDU6glTR0JfU
Oct 08 18:52:46 compute-0 systemd-logind[844]: New session 7 of user zuul.
Oct 08 18:52:46 compute-0 systemd[1]: Started Session 7 of User zuul.
Oct 08 18:52:46 compute-0 sshd-session[28788]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 08 18:52:47 compute-0 python3.9[28941]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 08 18:52:48 compute-0 sudo[29095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvuuljgnhmywqronxngdjtyromaeczeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949568.324407-34-154978430301760/AnsiballZ_command.py'
Oct 08 18:52:48 compute-0 sudo[29095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:49 compute-0 python3.9[29097]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:52:49 compute-0 sudo[29095]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:50 compute-0 sudo[29260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kudbrmsfbnhkcjhqnfborhbisbgsihyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949569.5829542-45-164691663651366/AnsiballZ_systemd_service.py'
Oct 08 18:52:50 compute-0 sudo[29260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:50 compute-0 python3.9[29262]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 08 18:52:50 compute-0 systemd[1]: Reloading.
Oct 08 18:52:50 compute-0 systemd-sysv-generator[29285]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:52:50 compute-0 systemd-rc-local-generator[29280]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:52:50 compute-0 sudo[29260]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:51 compute-0 python3.9[29447]: ansible-ansible.builtin.service_facts Invoked
Oct 08 18:52:51 compute-0 network[29464]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 08 18:52:51 compute-0 network[29465]: 'network-scripts' will be removed from distribution in near future.
Oct 08 18:52:51 compute-0 network[29466]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 08 18:52:56 compute-0 sudo[29728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlispmrduqxvqzjxibtetdykrrvpomno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949575.768303-64-256238875711136/AnsiballZ_systemd_service.py'
Oct 08 18:52:56 compute-0 sudo[29728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:56 compute-0 python3.9[29730]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:52:56 compute-0 sudo[29728]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:56 compute-0 sudo[29881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mywwerepuldiidiwdlwerpzlnzjyqofh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949576.585567-64-942383347930/AnsiballZ_systemd_service.py'
Oct 08 18:52:56 compute-0 sudo[29881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:57 compute-0 python3.9[29883]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:52:57 compute-0 sudo[29881]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:57 compute-0 sudo[30034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkycxbtxlbuaywmuoeeuxaiouczayefw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949577.4163384-64-146487077075140/AnsiballZ_systemd_service.py'
Oct 08 18:52:57 compute-0 sudo[30034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:58 compute-0 python3.9[30036]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:52:58 compute-0 sudo[30034]: pam_unix(sudo:session): session closed for user root
Oct 08 18:52:58 compute-0 sudo[30187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnxcgrzwxlsbqrgthvszrfntsnixyarf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949578.3088048-64-61992397225992/AnsiballZ_systemd_service.py'
Oct 08 18:52:58 compute-0 sudo[30187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:52:58 compute-0 python3.9[30189]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:53:00 compute-0 sudo[30187]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:00 compute-0 sudo[30340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkjzfplxwjxezinrvzlsbkglgvmztffd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949580.1990795-64-7564336632420/AnsiballZ_systemd_service.py'
Oct 08 18:53:00 compute-0 sudo[30340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:00 compute-0 python3.9[30342]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:53:00 compute-0 sudo[30340]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:01 compute-0 sudo[30493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmnavmbuwsnwtqzxxkhvisabztltnfrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949581.0633786-64-56837805572511/AnsiballZ_systemd_service.py'
Oct 08 18:53:01 compute-0 sudo[30493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:01 compute-0 python3.9[30495]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:53:01 compute-0 sudo[30493]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:02 compute-0 sudo[30646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krtmyrlpyvzgjffkuoluqmyqphskkddr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949581.8472106-64-66523698173570/AnsiballZ_systemd_service.py'
Oct 08 18:53:02 compute-0 sudo[30646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:02 compute-0 python3.9[30648]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:53:02 compute-0 sudo[30646]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:03 compute-0 sudo[30799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujdquouebsljypypmsmsbwribdqjsjig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949582.7530024-116-141223539491166/AnsiballZ_file.py'
Oct 08 18:53:03 compute-0 sudo[30799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:03 compute-0 python3.9[30801]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:53:03 compute-0 sudo[30799]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:04 compute-0 sudo[30951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iavaywmoupydafdvlmebwzmsfirrmxqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949583.68006-116-108884688899746/AnsiballZ_file.py'
Oct 08 18:53:04 compute-0 sudo[30951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:04 compute-0 python3.9[30953]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:53:04 compute-0 sudo[30951]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:05 compute-0 sudo[31103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aczyqunsgqzoflbinsraquopgvfnmmeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949584.6629276-116-240384539349902/AnsiballZ_file.py'
Oct 08 18:53:05 compute-0 sudo[31103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:05 compute-0 python3.9[31105]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:53:05 compute-0 sudo[31103]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:05 compute-0 sudo[31255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utbpskmqwfocqcmgvxcvxusjvcysayxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949585.4160633-116-43876556440585/AnsiballZ_file.py'
Oct 08 18:53:05 compute-0 sudo[31255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:05 compute-0 python3.9[31257]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:53:05 compute-0 sudo[31255]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:06 compute-0 sudo[31407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mixpzphrhaenhbbfqsohnafwmsiddabh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949586.0477405-116-193455054141907/AnsiballZ_file.py'
Oct 08 18:53:06 compute-0 sudo[31407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:06 compute-0 python3.9[31409]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:53:06 compute-0 sudo[31407]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:07 compute-0 sudo[31559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzexbhtkrimnnyikkpqifzfsjslkbwqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949586.8640263-116-257072685224151/AnsiballZ_file.py'
Oct 08 18:53:07 compute-0 sudo[31559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:07 compute-0 python3.9[31561]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:53:07 compute-0 sudo[31559]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:07 compute-0 sudo[31711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehyslliutkuwjowmxfflvoqbtuoribxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949587.6241524-116-204023830174732/AnsiballZ_file.py'
Oct 08 18:53:07 compute-0 sudo[31711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:08 compute-0 python3.9[31713]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:53:08 compute-0 sudo[31711]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:08 compute-0 sudo[31863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbdybjttzyxabjjphjlnhmqhpyoiteke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949588.3711226-166-208821445866133/AnsiballZ_file.py'
Oct 08 18:53:08 compute-0 sudo[31863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:08 compute-0 python3.9[31865]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:53:08 compute-0 sudo[31863]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:09 compute-0 sudo[32015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkyckmlekjhtehhtphpokihofvdkwinf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949589.0938015-166-262342232849377/AnsiballZ_file.py'
Oct 08 18:53:09 compute-0 sudo[32015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:09 compute-0 python3.9[32017]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:53:09 compute-0 sudo[32015]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:10 compute-0 sudo[32167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zswerukovhcjxumkgsobwcsuzhfnqnbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949589.8167968-166-178545903520228/AnsiballZ_file.py'
Oct 08 18:53:10 compute-0 sudo[32167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:10 compute-0 python3.9[32169]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:53:10 compute-0 sudo[32167]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:11 compute-0 sudo[32330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqzgubmneygnubarkrcumhbuuknmiald ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949590.5368736-166-51723187250571/AnsiballZ_file.py'
Oct 08 18:53:11 compute-0 sudo[32330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:11 compute-0 podman[32293]: 2025-10-08 18:53:11.053700088 +0000 UTC m=+0.095692829 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 08 18:53:11 compute-0 python3.9[32334]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:53:11 compute-0 sudo[32330]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:11 compute-0 sudo[32503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guzarjqtcqmmvltmhjxqzuhfevheoocy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949591.3670015-166-80353548659315/AnsiballZ_file.py'
Oct 08 18:53:11 compute-0 sudo[32503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:11 compute-0 podman[32464]: 2025-10-08 18:53:11.788803789 +0000 UTC m=+0.133670070 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 08 18:53:11 compute-0 python3.9[32511]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:53:11 compute-0 sudo[32503]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:12 compute-0 sudo[32668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfuqxjrlqqihuihspcapdqlwijasmwbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949592.1519408-166-159370602131813/AnsiballZ_file.py'
Oct 08 18:53:12 compute-0 sudo[32668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:12 compute-0 python3.9[32670]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:53:12 compute-0 sudo[32668]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:13 compute-0 sudo[32820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypicfgzyjwxtwsuajcyulcfhvgqgptie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949592.9332492-166-126018483713867/AnsiballZ_file.py'
Oct 08 18:53:13 compute-0 sudo[32820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:13 compute-0 python3.9[32822]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:53:13 compute-0 sudo[32820]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:14 compute-0 sudo[32972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynsckfacdoyvdqycjjukhodinvdwyktg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949593.7343602-217-63694384564723/AnsiballZ_command.py'
Oct 08 18:53:14 compute-0 sudo[32972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:14 compute-0 python3.9[32974]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                              systemctl disable --now certmonger.service
                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                            fi
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:53:14 compute-0 sudo[32972]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:15 compute-0 python3.9[33126]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 08 18:53:15 compute-0 sudo[33276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrxjuxshxdzezcskokeeigegudnjxnfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949595.5112643-235-208530194640275/AnsiballZ_systemd_service.py'
Oct 08 18:53:15 compute-0 sudo[33276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:16 compute-0 python3.9[33278]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 08 18:53:16 compute-0 systemd[1]: Reloading.
Oct 08 18:53:16 compute-0 systemd-rc-local-generator[33299]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:53:16 compute-0 systemd-sysv-generator[33308]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:53:16 compute-0 sudo[33276]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:17 compute-0 sudo[33463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psffijcioqcmkemaqqdcayzwgnfabxtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949596.7246246-243-31668394861838/AnsiballZ_command.py'
Oct 08 18:53:17 compute-0 sudo[33463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:17 compute-0 python3.9[33465]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:53:17 compute-0 sudo[33463]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:17 compute-0 sudo[33616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpcehnfdvbnealwpkozgppayggkqkimn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949597.4500291-243-278316011598976/AnsiballZ_command.py'
Oct 08 18:53:17 compute-0 sudo[33616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:18 compute-0 python3.9[33618]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:53:18 compute-0 sudo[33616]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:18 compute-0 sudo[33769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sotbidhwyychnxdzqlfbagagxsrmfnxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949598.2281737-243-28245556181397/AnsiballZ_command.py'
Oct 08 18:53:18 compute-0 sudo[33769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:18 compute-0 python3.9[33771]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:53:18 compute-0 sudo[33769]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:19 compute-0 sudo[33922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfzxbfnkjwjjjffjcvdmzxzpgwpzilnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949598.9902966-243-109271685321576/AnsiballZ_command.py'
Oct 08 18:53:19 compute-0 sudo[33922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:19 compute-0 python3.9[33924]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:53:19 compute-0 sudo[33922]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:20 compute-0 sudo[34075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tccppefohxpxuqjemczrekzjmyydocds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949599.7308905-243-264259471790287/AnsiballZ_command.py'
Oct 08 18:53:20 compute-0 sudo[34075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:20 compute-0 python3.9[34077]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:53:20 compute-0 sudo[34075]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:20 compute-0 sudo[34228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfjamxpehzwuyeddhdmxkzbmenjnrkhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949600.4540844-243-196751253093630/AnsiballZ_command.py'
Oct 08 18:53:20 compute-0 sudo[34228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:21 compute-0 python3.9[34230]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:53:21 compute-0 sudo[34228]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:21 compute-0 sudo[34381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfclpayzqjorjafumkhamejsrerktoeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949601.2601016-243-219497163570417/AnsiballZ_command.py'
Oct 08 18:53:21 compute-0 sudo[34381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:21 compute-0 python3.9[34383]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:53:21 compute-0 sudo[34381]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:22 compute-0 sudo[34534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbybeyqxvazpwbsayfqqnshrltonswmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949602.2157395-297-886047036510/AnsiballZ_getent.py'
Oct 08 18:53:22 compute-0 sudo[34534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:22 compute-0 python3.9[34536]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Oct 08 18:53:22 compute-0 sudo[34534]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:23 compute-0 sudo[34687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bztfaitrtkrzurotpfckzfzmecgzrcni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949603.198393-305-45237501610363/AnsiballZ_group.py'
Oct 08 18:53:23 compute-0 sudo[34687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:23 compute-0 python3.9[34689]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 08 18:53:23 compute-0 groupadd[34690]: group added to /etc/group: name=libvirt, GID=42473
Oct 08 18:53:24 compute-0 groupadd[34690]: group added to /etc/gshadow: name=libvirt
Oct 08 18:53:24 compute-0 groupadd[34690]: new group: name=libvirt, GID=42473
Oct 08 18:53:24 compute-0 sudo[34687]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:24 compute-0 sudo[34845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdjuywkfwrpxexgenjfmlboalcmhbwap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949604.2668586-313-243621836575205/AnsiballZ_user.py'
Oct 08 18:53:24 compute-0 sudo[34845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:25 compute-0 python3.9[34847]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 08 18:53:25 compute-0 useradd[34849]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Oct 08 18:53:25 compute-0 sudo[34845]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:25 compute-0 sudo[35005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orznkayaqwqzehpjmypaalxrmoyaysgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949605.6116605-324-201442767273640/AnsiballZ_setup.py'
Oct 08 18:53:25 compute-0 sudo[35005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:26 compute-0 python3.9[35007]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 08 18:53:26 compute-0 sudo[35005]: pam_unix(sudo:session): session closed for user root
Oct 08 18:53:27 compute-0 sudo[35089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcogmpkojevdrklmlscbejkblunkyngf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949605.6116605-324-201442767273640/AnsiballZ_dnf.py'
Oct 08 18:53:27 compute-0 sudo[35089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:53:27 compute-0 python3.9[35091]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 08 18:53:41 compute-0 podman[35240]: 2025-10-08 18:53:41.679212269 +0000 UTC m=+0.080613400 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 08 18:53:42 compute-0 podman[35287]: 2025-10-08 18:53:42.743667235 +0000 UTC m=+0.160106964 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 18:53:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:53:44.211 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 18:53:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:53:44.212 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 18:53:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:53:44.212 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 18:53:57 compute-0 kernel: SELinux:  Converting 429 SID table entries...
Oct 08 18:53:57 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 08 18:53:57 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 08 18:53:57 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 08 18:53:57 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 08 18:53:57 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 08 18:53:57 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 08 18:53:57 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 08 18:54:06 compute-0 kernel: SELinux:  Converting 429 SID table entries...
Oct 08 18:54:06 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 08 18:54:06 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 08 18:54:06 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 08 18:54:06 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 08 18:54:06 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 08 18:54:06 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 08 18:54:06 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 08 18:54:12 compute-0 dbus-broker-launch[836]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Oct 08 18:54:12 compute-0 podman[35348]: 2025-10-08 18:54:12.673087206 +0000 UTC m=+0.079712062 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 08 18:54:13 compute-0 podman[35367]: 2025-10-08 18:54:13.748459586 +0000 UTC m=+0.161262780 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct 08 18:54:43 compute-0 podman[47860]: 2025-10-08 18:54:43.655207006 +0000 UTC m=+0.063780487 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct 08 18:54:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:54:44.212 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 18:54:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:54:44.213 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 18:54:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:54:44.213 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 18:54:44 compute-0 podman[48381]: 2025-10-08 18:54:44.686089339 +0000 UTC m=+0.106843647 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 08 18:55:04 compute-0 kernel: SELinux:  Converting 430 SID table entries...
Oct 08 18:55:04 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 08 18:55:04 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 08 18:55:04 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 08 18:55:04 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 08 18:55:04 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 08 18:55:04 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 08 18:55:04 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 08 18:55:05 compute-0 groupadd[52197]: group added to /etc/group: name=dnsmasq, GID=992
Oct 08 18:55:05 compute-0 groupadd[52197]: group added to /etc/gshadow: name=dnsmasq
Oct 08 18:55:05 compute-0 groupadd[52197]: new group: name=dnsmasq, GID=992
Oct 08 18:55:05 compute-0 useradd[52204]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Oct 08 18:55:05 compute-0 dbus-broker-launch[835]: Noticed file-system modification, trigger reload.
Oct 08 18:55:05 compute-0 dbus-broker-launch[836]: avc:  op=load_policy lsm=selinux seqno=5 res=1
Oct 08 18:55:05 compute-0 dbus-broker-launch[835]: Noticed file-system modification, trigger reload.
Oct 08 18:55:06 compute-0 groupadd[52217]: group added to /etc/group: name=clevis, GID=991
Oct 08 18:55:06 compute-0 groupadd[52217]: group added to /etc/gshadow: name=clevis
Oct 08 18:55:06 compute-0 groupadd[52217]: new group: name=clevis, GID=991
Oct 08 18:55:06 compute-0 useradd[52224]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Oct 08 18:55:06 compute-0 usermod[52234]: add 'clevis' to group 'tss'
Oct 08 18:55:06 compute-0 usermod[52234]: add 'clevis' to shadow group 'tss'
Oct 08 18:55:08 compute-0 polkitd[1188]: Reloading rules
Oct 08 18:55:08 compute-0 polkitd[1188]: Collecting garbage unconditionally...
Oct 08 18:55:08 compute-0 polkitd[1188]: Loading rules from directory /etc/polkit-1/rules.d
Oct 08 18:55:08 compute-0 polkitd[1188]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 08 18:55:08 compute-0 polkitd[1188]: Finished loading, compiling and executing 4 rules
Oct 08 18:55:08 compute-0 polkitd[1188]: Reloading rules
Oct 08 18:55:08 compute-0 polkitd[1188]: Collecting garbage unconditionally...
Oct 08 18:55:08 compute-0 polkitd[1188]: Loading rules from directory /etc/polkit-1/rules.d
Oct 08 18:55:08 compute-0 polkitd[1188]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 08 18:55:08 compute-0 polkitd[1188]: Finished loading, compiling and executing 4 rules
Oct 08 18:55:10 compute-0 groupadd[52421]: group added to /etc/group: name=ceph, GID=167
Oct 08 18:55:10 compute-0 groupadd[52421]: group added to /etc/gshadow: name=ceph
Oct 08 18:55:10 compute-0 groupadd[52421]: new group: name=ceph, GID=167
Oct 08 18:55:10 compute-0 useradd[52427]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Oct 08 18:55:13 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Oct 08 18:55:13 compute-0 sshd[1289]: Received signal 15; terminating.
Oct 08 18:55:13 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Oct 08 18:55:13 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Oct 08 18:55:13 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Oct 08 18:55:13 compute-0 systemd[1]: Stopping sshd-keygen.target...
Oct 08 18:55:13 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 08 18:55:13 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 08 18:55:13 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 08 18:55:13 compute-0 systemd[1]: Reached target sshd-keygen.target.
Oct 08 18:55:13 compute-0 systemd[1]: Starting OpenSSH server daemon...
Oct 08 18:55:13 compute-0 sshd[52930]: Server listening on 0.0.0.0 port 22.
Oct 08 18:55:13 compute-0 sshd[52930]: Server listening on :: port 22.
Oct 08 18:55:13 compute-0 systemd[1]: Started OpenSSH server daemon.
Oct 08 18:55:13 compute-0 podman[52973]: 2025-10-08 18:55:13.810598302 +0000 UTC m=+0.097082504 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 08 18:55:14 compute-0 podman[53105]: 2025-10-08 18:55:14.888327658 +0000 UTC m=+0.121141829 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 08 18:55:15 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 08 18:55:15 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 08 18:55:15 compute-0 systemd[1]: Reloading.
Oct 08 18:55:15 compute-0 systemd-rc-local-generator[53231]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:55:15 compute-0 systemd-sysv-generator[53236]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:55:16 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 08 18:55:19 compute-0 systemd[1]: Starting PackageKit Daemon...
Oct 08 18:55:19 compute-0 PackageKit[55288]: daemon start
Oct 08 18:55:19 compute-0 systemd[1]: Started PackageKit Daemon.
Oct 08 18:55:19 compute-0 sudo[35089]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:20 compute-0 sudo[56750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agblczlhltuqodbpxzzifvbpfidihumk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949719.7772486-336-129248927900419/AnsiballZ_systemd.py'
Oct 08 18:55:20 compute-0 sudo[56750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:20 compute-0 python3.9[56781]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 08 18:55:20 compute-0 systemd[1]: Reloading.
Oct 08 18:55:20 compute-0 systemd-rc-local-generator[57153]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:55:20 compute-0 systemd-sysv-generator[57162]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:55:21 compute-0 sudo[56750]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:21 compute-0 sudo[57838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hylvmkmcryelassgzzlvhrmlvmglybmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949721.291427-336-15444028561106/AnsiballZ_systemd.py'
Oct 08 18:55:21 compute-0 sudo[57838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:21 compute-0 python3.9[57857]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 08 18:55:22 compute-0 systemd[1]: Reloading.
Oct 08 18:55:22 compute-0 systemd-rc-local-generator[58253]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:55:22 compute-0 systemd-sysv-generator[58259]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:55:22 compute-0 sudo[57838]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:22 compute-0 sudo[58918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtdnesaqkdfsigffmwwynzkfzgkhmbgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949722.43962-336-236016711549972/AnsiballZ_systemd.py'
Oct 08 18:55:22 compute-0 sudo[58918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:23 compute-0 python3.9[58936]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 08 18:55:23 compute-0 systemd[1]: Reloading.
Oct 08 18:55:23 compute-0 systemd-rc-local-generator[59324]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:55:23 compute-0 systemd-sysv-generator[59328]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:55:23 compute-0 sudo[58918]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:23 compute-0 sudo[59973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahgunimryqzeugluhgeqptxckyjckezx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949723.62588-336-278531113196809/AnsiballZ_systemd.py'
Oct 08 18:55:23 compute-0 sudo[59973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:24 compute-0 python3.9[59994]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 08 18:55:24 compute-0 systemd[1]: Reloading.
Oct 08 18:55:24 compute-0 systemd-rc-local-generator[60384]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:55:24 compute-0 systemd-sysv-generator[60387]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:55:24 compute-0 sudo[59973]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:25 compute-0 sudo[61195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frbfteatjmqkwdypfhqfempcaixdowbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949724.8587842-365-173525875251220/AnsiballZ_systemd.py'
Oct 08 18:55:25 compute-0 sudo[61195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:25 compute-0 python3.9[61216]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:25 compute-0 systemd[1]: Reloading.
Oct 08 18:55:25 compute-0 systemd-rc-local-generator[61649]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:55:25 compute-0 systemd-sysv-generator[61657]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:55:25 compute-0 sudo[61195]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:26 compute-0 sudo[62303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inkwkkuqyklglnxkauatpqgmekiyvdfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949726.023868-365-16731499782470/AnsiballZ_systemd.py'
Oct 08 18:55:26 compute-0 sudo[62303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:26 compute-0 python3.9[62327]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:26 compute-0 systemd[1]: Reloading.
Oct 08 18:55:26 compute-0 systemd-rc-local-generator[62613]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:55:26 compute-0 systemd-sysv-generator[62616]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:55:27 compute-0 sudo[62303]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:27 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 08 18:55:27 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 08 18:55:27 compute-0 systemd[1]: man-db-cache-update.service: Consumed 13.141s CPU time.
Oct 08 18:55:27 compute-0 systemd[1]: run-r92d085bfb7384d62b55786ab9549f683.service: Deactivated successfully.
Oct 08 18:55:27 compute-0 sudo[62771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvmqoomdyofvrjqdzqrofwpkffxfygsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949727.2183886-365-163805840699751/AnsiballZ_systemd.py'
Oct 08 18:55:27 compute-0 sudo[62771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:27 compute-0 python3.9[62773]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:28 compute-0 systemd[1]: Reloading.
Oct 08 18:55:28 compute-0 systemd-rc-local-generator[62799]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:55:28 compute-0 systemd-sysv-generator[62804]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:55:28 compute-0 sudo[62771]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:28 compute-0 sudo[62961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdzdrzhwbnpifcynlyrjknqzsqpctngu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949728.5188355-365-165230944664448/AnsiballZ_systemd.py'
Oct 08 18:55:28 compute-0 sudo[62961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:29 compute-0 python3.9[62963]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:29 compute-0 sudo[62961]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:29 compute-0 sudo[63116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkpjlejgvstsmjbzuuezpgauhzwayshr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949729.5117621-365-254116160070095/AnsiballZ_systemd.py'
Oct 08 18:55:29 compute-0 sudo[63116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:30 compute-0 python3.9[63118]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:30 compute-0 systemd[1]: Reloading.
Oct 08 18:55:30 compute-0 systemd-rc-local-generator[63149]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:55:30 compute-0 systemd-sysv-generator[63152]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:55:30 compute-0 sudo[63116]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:31 compute-0 sudo[63306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjelupmqshliohuueisaiipqphuazuma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949730.7081668-401-278042091368814/AnsiballZ_systemd.py'
Oct 08 18:55:31 compute-0 sudo[63306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:31 compute-0 python3.9[63308]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 08 18:55:31 compute-0 systemd[1]: Reloading.
Oct 08 18:55:31 compute-0 systemd-rc-local-generator[63340]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:55:31 compute-0 systemd-sysv-generator[63343]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:55:31 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Oct 08 18:55:31 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Oct 08 18:55:31 compute-0 sudo[63306]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:32 compute-0 sudo[63500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldxtqdsmesnvxsbyepnsnteiyyzdmgab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949731.9775472-409-41692773394969/AnsiballZ_systemd.py'
Oct 08 18:55:32 compute-0 sudo[63500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:32 compute-0 python3.9[63502]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:32 compute-0 sudo[63500]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:33 compute-0 sudo[63655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mychulhkbvqbuobvaqttsrbqgutjyrhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949732.9595122-409-254313872117470/AnsiballZ_systemd.py'
Oct 08 18:55:33 compute-0 sudo[63655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:33 compute-0 python3.9[63657]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:33 compute-0 sudo[63655]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:34 compute-0 sudo[63810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tywnwrbnayxhtnurnathnjablkruoizp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949733.8868155-409-67331049664615/AnsiballZ_systemd.py'
Oct 08 18:55:34 compute-0 sudo[63810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:34 compute-0 python3.9[63812]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:34 compute-0 sudo[63810]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:35 compute-0 sudo[63965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wesvnupdhloueobtlmmjusrwzqvxtsqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949734.82117-409-53899255966908/AnsiballZ_systemd.py'
Oct 08 18:55:35 compute-0 sudo[63965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:35 compute-0 python3.9[63967]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:35 compute-0 sudo[63965]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:36 compute-0 sudo[64120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dovmemoudxadczmxyctoufatdrayqedg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949735.7683995-409-200181086319427/AnsiballZ_systemd.py'
Oct 08 18:55:36 compute-0 sudo[64120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:36 compute-0 python3.9[64122]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:36 compute-0 sudo[64120]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:37 compute-0 sudo[64275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kneuwmtdryfbusyfjzvnokuewoxyztfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949736.7887926-409-139023035799801/AnsiballZ_systemd.py'
Oct 08 18:55:37 compute-0 sudo[64275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:37 compute-0 python3.9[64277]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:37 compute-0 sudo[64275]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:38 compute-0 sudo[64430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-virfqhixcfdwydakclaphswerhvmualo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949737.7840743-409-163883813811316/AnsiballZ_systemd.py'
Oct 08 18:55:38 compute-0 sudo[64430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:38 compute-0 python3.9[64432]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:38 compute-0 sudo[64430]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:39 compute-0 sudo[64585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utblrvcyljyobrrqxsqpzxqnrphbemdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949738.6529975-409-153053347453219/AnsiballZ_systemd.py'
Oct 08 18:55:39 compute-0 sudo[64585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:39 compute-0 python3.9[64587]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:39 compute-0 sudo[64585]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:39 compute-0 systemd[1314]: Created slice User Background Tasks Slice.
Oct 08 18:55:39 compute-0 systemd[1314]: Starting Cleanup of User's Temporary Files and Directories...
Oct 08 18:55:39 compute-0 systemd[1314]: Finished Cleanup of User's Temporary Files and Directories.
Oct 08 18:55:39 compute-0 sudo[64741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdyslbblemdayvidikichvbjniiqvyfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949739.6172829-409-110375465797357/AnsiballZ_systemd.py'
Oct 08 18:55:39 compute-0 sudo[64741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:40 compute-0 python3.9[64743]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:40 compute-0 sudo[64741]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:40 compute-0 sudo[64896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdjcniydggansuphjvczhzmtptgkppml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949740.5017834-409-144491753571484/AnsiballZ_systemd.py'
Oct 08 18:55:40 compute-0 sudo[64896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:41 compute-0 python3.9[64898]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:41 compute-0 sudo[64896]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:41 compute-0 sudo[65051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlissmvmrxmalwcsolvvubxkazejchau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949741.3494263-409-51560395573393/AnsiballZ_systemd.py'
Oct 08 18:55:41 compute-0 sudo[65051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:41 compute-0 python3.9[65053]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:42 compute-0 sudo[65051]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:42 compute-0 sudo[65206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxucqxzkzwucxyhpbazlxhgixqnfuhpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949742.1860702-409-225230966165852/AnsiballZ_systemd.py'
Oct 08 18:55:42 compute-0 sudo[65206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:42 compute-0 python3.9[65208]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:42 compute-0 sudo[65206]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:43 compute-0 sudo[65361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbxiiodjtlrcmxisbmobfvbffqpghhby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949743.0953336-409-189378597029387/AnsiballZ_systemd.py'
Oct 08 18:55:43 compute-0 sudo[65361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:43 compute-0 python3.9[65363]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:43 compute-0 sudo[65361]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:55:44.213 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 18:55:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:55:44.214 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 18:55:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:55:44.215 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 18:55:44 compute-0 podman[65490]: 2025-10-08 18:55:44.383477388 +0000 UTC m=+0.066556675 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct 08 18:55:44 compute-0 sudo[65535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvffzbepyoakuhiizzmxkhbgvmqqyatg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949743.990119-409-169455089920879/AnsiballZ_systemd.py'
Oct 08 18:55:44 compute-0 sudo[65535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:44 compute-0 python3.9[65537]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 08 18:55:44 compute-0 sudo[65535]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:45 compute-0 sudo[65706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyotzatqnsgstjbpfdgoscyxlqoqxqem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949745.1667042-511-90672098119973/AnsiballZ_file.py'
Oct 08 18:55:45 compute-0 sudo[65706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:45 compute-0 podman[65664]: 2025-10-08 18:55:45.537146004 +0000 UTC m=+0.097928709 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 08 18:55:45 compute-0 python3.9[65714]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:55:45 compute-0 sudo[65706]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:46 compute-0 sudo[65869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unabutlhypsiducolszpnfdnubwbguiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949745.9133415-511-117076986363526/AnsiballZ_file.py'
Oct 08 18:55:46 compute-0 sudo[65869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:46 compute-0 python3.9[65871]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:55:46 compute-0 sudo[65869]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:46 compute-0 sudo[66021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyorbflphanyxwwjyozhbseyxscvojdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949746.6150696-511-49046989135959/AnsiballZ_file.py'
Oct 08 18:55:46 compute-0 sudo[66021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:47 compute-0 python3.9[66023]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:55:47 compute-0 sudo[66021]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:47 compute-0 sudo[66173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgzciojzjtraswpnoxgxwjpqhksfxulk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949747.3406298-511-268628370252029/AnsiballZ_file.py'
Oct 08 18:55:47 compute-0 sudo[66173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:47 compute-0 python3.9[66175]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:55:47 compute-0 sudo[66173]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:48 compute-0 sudo[66325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feoehcdxjelyyqwehvpadyxckbkmdqsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949748.1444657-511-229372733237823/AnsiballZ_file.py'
Oct 08 18:55:48 compute-0 sudo[66325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:48 compute-0 python3.9[66327]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:55:48 compute-0 sudo[66325]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:49 compute-0 sudo[66477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gatxyflvkpoizjdnexhwawqimderwuhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949748.8427835-511-139081548221887/AnsiballZ_file.py'
Oct 08 18:55:49 compute-0 sudo[66477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:49 compute-0 python3.9[66479]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:55:49 compute-0 sudo[66477]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:50 compute-0 sudo[66629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsxjlnclkxwctjqyhfaoiffwkvvfdhnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949749.6356785-554-3698774746160/AnsiballZ_stat.py'
Oct 08 18:55:50 compute-0 sudo[66629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:50 compute-0 python3.9[66631]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:55:50 compute-0 sudo[66629]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:51 compute-0 sudo[66754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcciroqalnomeuokhxarjkyqhujvyikz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949749.6356785-554-3698774746160/AnsiballZ_copy.py'
Oct 08 18:55:51 compute-0 sudo[66754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:51 compute-0 python3.9[66756]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759949749.6356785-554-3698774746160/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:55:51 compute-0 sudo[66754]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:51 compute-0 sudo[66906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yspefartqblmvkhvugzdypzyoyyexkio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949751.4076536-554-212111276544674/AnsiballZ_stat.py'
Oct 08 18:55:51 compute-0 sudo[66906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:51 compute-0 python3.9[66908]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:55:52 compute-0 sudo[66906]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:52 compute-0 sudo[67031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gywcrhccndmefficagbiwymgwiuneqpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949751.4076536-554-212111276544674/AnsiballZ_copy.py'
Oct 08 18:55:52 compute-0 sudo[67031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:52 compute-0 python3.9[67033]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759949751.4076536-554-212111276544674/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:55:52 compute-0 sudo[67031]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:53 compute-0 sudo[67183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifwplmgemwnvnnsgezaelquyozbuhhdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949752.837564-554-96647499880372/AnsiballZ_stat.py'
Oct 08 18:55:53 compute-0 sudo[67183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:53 compute-0 python3.9[67185]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:55:53 compute-0 sudo[67183]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:53 compute-0 sudo[67308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urowfgfvdwocirznsmbenlxhenzckzgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949752.837564-554-96647499880372/AnsiballZ_copy.py'
Oct 08 18:55:53 compute-0 sudo[67308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:54 compute-0 python3.9[67310]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759949752.837564-554-96647499880372/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:55:54 compute-0 sudo[67308]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:54 compute-0 sudo[67460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spghltfinptnyedfltztpsitxqzncmhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949754.2665875-554-183634025307668/AnsiballZ_stat.py'
Oct 08 18:55:54 compute-0 sudo[67460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:54 compute-0 python3.9[67462]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:55:54 compute-0 sudo[67460]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:55 compute-0 sudo[67585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beicdbwwpwdbazjhqesolyipyuriffio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949754.2665875-554-183634025307668/AnsiballZ_copy.py'
Oct 08 18:55:55 compute-0 sudo[67585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:55 compute-0 python3.9[67587]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759949754.2665875-554-183634025307668/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:55:55 compute-0 sudo[67585]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:56 compute-0 sudo[67737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bilutchtngvkkuagwtovkwuzlbzdnkdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949755.7638974-554-241627026644268/AnsiballZ_stat.py'
Oct 08 18:55:56 compute-0 sudo[67737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:56 compute-0 python3.9[67739]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:55:56 compute-0 sudo[67737]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:56 compute-0 sudo[67862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpsixbminlxzlrbqxcecfiskckikiots ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949755.7638974-554-241627026644268/AnsiballZ_copy.py'
Oct 08 18:55:56 compute-0 sudo[67862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:57 compute-0 python3.9[67864]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759949755.7638974-554-241627026644268/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:55:57 compute-0 sudo[67862]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:57 compute-0 sudo[68014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvrlfrvqljuzubatsumamplarjevtvke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949757.225823-554-105428098353981/AnsiballZ_stat.py'
Oct 08 18:55:57 compute-0 sudo[68014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:57 compute-0 python3.9[68016]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:55:57 compute-0 sudo[68014]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:58 compute-0 sudo[68139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goegcnrcxlmpubvwvlhnzomkremfhteb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949757.225823-554-105428098353981/AnsiballZ_copy.py'
Oct 08 18:55:58 compute-0 sudo[68139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:58 compute-0 python3.9[68141]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759949757.225823-554-105428098353981/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:55:58 compute-0 sudo[68139]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:58 compute-0 sudo[68291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlxpakyrjpkiamwrxupehxbfzebxhwfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949758.588169-554-224101721322219/AnsiballZ_stat.py'
Oct 08 18:55:58 compute-0 sudo[68291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:59 compute-0 python3.9[68293]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:55:59 compute-0 sudo[68291]: pam_unix(sudo:session): session closed for user root
Oct 08 18:55:59 compute-0 sudo[68414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efxnbomfnsszbxlkfndimnpojnslidnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949758.588169-554-224101721322219/AnsiballZ_copy.py'
Oct 08 18:55:59 compute-0 sudo[68414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:55:59 compute-0 python3.9[68416]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759949758.588169-554-224101721322219/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:55:59 compute-0 sudo[68414]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:00 compute-0 sudo[68566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxriruffezylnvivctekbflppoukqxyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949759.906468-554-177064624935986/AnsiballZ_stat.py'
Oct 08 18:56:00 compute-0 sudo[68566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:00 compute-0 python3.9[68568]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:00 compute-0 sudo[68566]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:00 compute-0 sudo[68691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imffqdtsiulhhrfrrmdvibjvtzffvhdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949759.906468-554-177064624935986/AnsiballZ_copy.py'
Oct 08 18:56:00 compute-0 sudo[68691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:01 compute-0 python3.9[68693]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759949759.906468-554-177064624935986/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:01 compute-0 sudo[68691]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:01 compute-0 sudo[68843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oifhnboqaxkmsrjeiwgbcnjepjcwcsrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949761.3629959-667-13039165212957/AnsiballZ_command.py'
Oct 08 18:56:01 compute-0 sudo[68843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:01 compute-0 python3.9[68845]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Oct 08 18:56:01 compute-0 sudo[68843]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:02 compute-0 sudo[68996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vndzypikbjhpakhhgkyxvztearrsufny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949762.169516-676-228266232460317/AnsiballZ_file.py'
Oct 08 18:56:02 compute-0 sudo[68996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:02 compute-0 python3.9[68998]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:02 compute-0 sudo[68996]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:03 compute-0 sudo[69148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peitdiidwqngfmbzswrslxolfzsayiil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949762.967767-676-170535941636869/AnsiballZ_file.py'
Oct 08 18:56:03 compute-0 sudo[69148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:03 compute-0 python3.9[69150]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:03 compute-0 sudo[69148]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:03 compute-0 sudo[69300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfvavzwzatlaysyqkdyrpqkhxhcyjdxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949763.6871445-676-44259224897865/AnsiballZ_file.py'
Oct 08 18:56:03 compute-0 sudo[69300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:04 compute-0 python3.9[69302]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:04 compute-0 sudo[69300]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:04 compute-0 sudo[69452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seudvuqfmqsauiftajjmdhqatlrbntyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949764.3650565-676-125042684136615/AnsiballZ_file.py'
Oct 08 18:56:04 compute-0 sudo[69452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:04 compute-0 python3.9[69454]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:04 compute-0 sudo[69452]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:05 compute-0 sudo[69604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjttobiamxgxwzettdduryelohfzrvhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949765.0424519-676-252079952918718/AnsiballZ_file.py'
Oct 08 18:56:05 compute-0 sudo[69604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:05 compute-0 python3.9[69606]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:05 compute-0 sudo[69604]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:06 compute-0 sudo[69756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfzoxxedkkiwpynpuixtwcfozqaoxixc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949765.7233899-676-165882544594654/AnsiballZ_file.py'
Oct 08 18:56:06 compute-0 sudo[69756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:06 compute-0 python3.9[69758]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:06 compute-0 sudo[69756]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:06 compute-0 sudo[69908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kejywotwzuulhiygxyfzdeqzmpvzkcnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949766.4433024-676-12140822130108/AnsiballZ_file.py'
Oct 08 18:56:06 compute-0 sudo[69908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:07 compute-0 python3.9[69910]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:07 compute-0 sudo[69908]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:07 compute-0 sudo[70060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqhsqehdbgvovcbztofgkrszlibgdjck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949767.2482755-676-148594300946444/AnsiballZ_file.py'
Oct 08 18:56:07 compute-0 sudo[70060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:07 compute-0 python3.9[70062]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:07 compute-0 sudo[70060]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:08 compute-0 sudo[70212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haxuojxsxcbupjognumtvuntacykvjqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949767.9852552-676-112234620025127/AnsiballZ_file.py'
Oct 08 18:56:08 compute-0 sudo[70212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:08 compute-0 python3.9[70214]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:08 compute-0 sudo[70212]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:09 compute-0 sudo[70364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enyvuiybkpgrlexpshidgxzayfgsmvep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949768.7931044-676-114650796438611/AnsiballZ_file.py'
Oct 08 18:56:09 compute-0 sudo[70364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:09 compute-0 python3.9[70366]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:09 compute-0 sudo[70364]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:09 compute-0 sudo[70516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmbrsljudwlrotnnsjbwbsmrxbegzutr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949769.5750303-676-116596803744032/AnsiballZ_file.py'
Oct 08 18:56:09 compute-0 sudo[70516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:10 compute-0 python3.9[70518]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:10 compute-0 sudo[70516]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:10 compute-0 sudo[70668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plmvexhqbfgzjyrccqdscxuxqgqhykzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949770.3089247-676-194222401322720/AnsiballZ_file.py'
Oct 08 18:56:10 compute-0 sudo[70668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:10 compute-0 python3.9[70670]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:10 compute-0 sudo[70668]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:11 compute-0 sudo[70820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gaclplhyekqqpdzyeltlhkgedwqwlizm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949771.0457811-676-23267563651340/AnsiballZ_file.py'
Oct 08 18:56:11 compute-0 sudo[70820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:11 compute-0 python3.9[70822]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:11 compute-0 sudo[70820]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:12 compute-0 sudo[70972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzxdxhtfdohqijyorvphhjrhvuugxzib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949771.8142421-676-226149537782720/AnsiballZ_file.py'
Oct 08 18:56:12 compute-0 sudo[70972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:12 compute-0 python3.9[70974]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:12 compute-0 sudo[70972]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:12 compute-0 sudo[71124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycjvoywkxtwejvrttarhybrqnqrrdwxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949772.6024625-775-64848924783671/AnsiballZ_stat.py'
Oct 08 18:56:12 compute-0 sudo[71124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:13 compute-0 python3.9[71126]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:13 compute-0 sudo[71124]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:13 compute-0 sudo[71247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktvthixwgbgvoplhqepdpjmurzydnlwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949772.6024625-775-64848924783671/AnsiballZ_copy.py'
Oct 08 18:56:13 compute-0 sudo[71247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:13 compute-0 python3.9[71249]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949772.6024625-775-64848924783671/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:13 compute-0 sudo[71247]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:14 compute-0 sudo[71399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scsjudefphvmtgoxgqwosaybspgsrcwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949773.9466448-775-174633968379587/AnsiballZ_stat.py'
Oct 08 18:56:14 compute-0 sudo[71399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:14 compute-0 python3.9[71401]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:14 compute-0 sudo[71399]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:14 compute-0 podman[71402]: 2025-10-08 18:56:14.66658234 +0000 UTC m=+0.076192063 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Oct 08 18:56:15 compute-0 sudo[71541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agbsjfcmnyfgahfccsfrmmgilncsymzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949773.9466448-775-174633968379587/AnsiballZ_copy.py'
Oct 08 18:56:15 compute-0 sudo[71541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:15 compute-0 python3.9[71543]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949773.9466448-775-174633968379587/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:15 compute-0 sudo[71541]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:15 compute-0 podman[71620]: 2025-10-08 18:56:15.663475415 +0000 UTC m=+0.086147839 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 18:56:15 compute-0 sudo[71719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cazmhubjuweaufgphwdxcxddqmfjalil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949775.455475-775-149693139539499/AnsiballZ_stat.py'
Oct 08 18:56:15 compute-0 sudo[71719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:15 compute-0 python3.9[71721]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:15 compute-0 sudo[71719]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:16 compute-0 sudo[71842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxtbvkbbdrfjndhwdqdjoihyvzaezvvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949775.455475-775-149693139539499/AnsiballZ_copy.py'
Oct 08 18:56:16 compute-0 sudo[71842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:16 compute-0 python3.9[71844]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949775.455475-775-149693139539499/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:16 compute-0 sudo[71842]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:17 compute-0 sudo[71994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phvgxfsovpiwlhygtxzrimgqtmzdlyux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949776.8199081-775-186647715145854/AnsiballZ_stat.py'
Oct 08 18:56:17 compute-0 sudo[71994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:17 compute-0 python3.9[71996]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:17 compute-0 sudo[71994]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:17 compute-0 sudo[72117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwxxrnzfrplfatpctmgomrdwdyimvmmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949776.8199081-775-186647715145854/AnsiballZ_copy.py'
Oct 08 18:56:17 compute-0 sudo[72117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:18 compute-0 python3.9[72119]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949776.8199081-775-186647715145854/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:18 compute-0 sudo[72117]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:18 compute-0 sudo[72269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mimdakrzzxgufshutwjcgcpqjtendttd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949778.3340044-775-33193879510605/AnsiballZ_stat.py'
Oct 08 18:56:18 compute-0 sudo[72269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:18 compute-0 python3.9[72271]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:18 compute-0 sudo[72269]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:19 compute-0 sudo[72392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjxtusabqrewmwutadeeylnmtayvwrdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949778.3340044-775-33193879510605/AnsiballZ_copy.py'
Oct 08 18:56:19 compute-0 sudo[72392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:19 compute-0 python3.9[72394]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949778.3340044-775-33193879510605/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:19 compute-0 sudo[72392]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:20 compute-0 sudo[72544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koadpjjdivnlgomfgcwyjqtfzaitkxdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949779.8889744-775-38153509636633/AnsiballZ_stat.py'
Oct 08 18:56:20 compute-0 sudo[72544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:20 compute-0 python3.9[72546]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:20 compute-0 sudo[72544]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:20 compute-0 sudo[72667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfziwqgjruyjsjtdmpgnhjkhlhlbnhyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949779.8889744-775-38153509636633/AnsiballZ_copy.py'
Oct 08 18:56:20 compute-0 sudo[72667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:21 compute-0 python3.9[72669]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949779.8889744-775-38153509636633/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:21 compute-0 sudo[72667]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:21 compute-0 sudo[72819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvomctylockcisjtahiqmhclcbqaqvll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949781.3838897-775-120043933417754/AnsiballZ_stat.py'
Oct 08 18:56:21 compute-0 sudo[72819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:21 compute-0 python3.9[72821]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:22 compute-0 sudo[72819]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:22 compute-0 sudo[72942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buzyimannpjrnddmdiadhoescnbmwrnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949781.3838897-775-120043933417754/AnsiballZ_copy.py'
Oct 08 18:56:22 compute-0 sudo[72942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:22 compute-0 python3.9[72944]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949781.3838897-775-120043933417754/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:22 compute-0 sudo[72942]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:23 compute-0 sudo[73094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uodebushdwopbiatyxrfsompzitauulv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949782.798273-775-250926935508102/AnsiballZ_stat.py'
Oct 08 18:56:23 compute-0 sudo[73094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:23 compute-0 python3.9[73096]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:23 compute-0 sudo[73094]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:23 compute-0 sudo[73217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwpwwigmztivaivjhjnagxjdjfeyeins ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949782.798273-775-250926935508102/AnsiballZ_copy.py'
Oct 08 18:56:23 compute-0 sudo[73217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:24 compute-0 python3.9[73219]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949782.798273-775-250926935508102/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:24 compute-0 sudo[73217]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:24 compute-0 sudo[73369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eojgiwkgmpssjawqqbpipgnwlipcooah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949784.2555404-775-67613677167536/AnsiballZ_stat.py'
Oct 08 18:56:24 compute-0 sudo[73369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:24 compute-0 python3.9[73371]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:24 compute-0 sudo[73369]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:25 compute-0 sudo[73492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdaqehictihkiafofviwsoahixqwaipj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949784.2555404-775-67613677167536/AnsiballZ_copy.py'
Oct 08 18:56:25 compute-0 sudo[73492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:25 compute-0 python3.9[73494]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949784.2555404-775-67613677167536/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:25 compute-0 sudo[73492]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:25 compute-0 sudo[73644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnonmzekwsjiwtfmbncvfdssbbiuuqaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949785.619427-775-141969523735536/AnsiballZ_stat.py'
Oct 08 18:56:25 compute-0 sudo[73644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:26 compute-0 python3.9[73646]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:26 compute-0 sudo[73644]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:26 compute-0 sudo[73767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bykkqvzwlfbjbhjuijukagxrrwbmojdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949785.619427-775-141969523735536/AnsiballZ_copy.py'
Oct 08 18:56:26 compute-0 sudo[73767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:26 compute-0 python3.9[73769]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949785.619427-775-141969523735536/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:26 compute-0 sudo[73767]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:27 compute-0 sudo[73919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjwctwhxrveybreytfkonjnekknbgfgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949787.020448-775-17254962897182/AnsiballZ_stat.py'
Oct 08 18:56:27 compute-0 sudo[73919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:27 compute-0 python3.9[73921]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:27 compute-0 sudo[73919]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:28 compute-0 sudo[74042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lckdrobnhqdtbiurilgybvwxvjzrgxqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949787.020448-775-17254962897182/AnsiballZ_copy.py'
Oct 08 18:56:28 compute-0 sudo[74042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:28 compute-0 python3.9[74044]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949787.020448-775-17254962897182/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:28 compute-0 sudo[74042]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:28 compute-0 sudo[74194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdrutmaxxdxodisesmkkeejxvhuckpcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949788.434147-775-46627806559662/AnsiballZ_stat.py'
Oct 08 18:56:28 compute-0 sudo[74194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:28 compute-0 python3.9[74196]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:28 compute-0 sudo[74194]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:29 compute-0 sudo[74317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msliuyqgtvajzjbekmtiowjfoqzfwewt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949788.434147-775-46627806559662/AnsiballZ_copy.py'
Oct 08 18:56:29 compute-0 sudo[74317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:29 compute-0 python3.9[74319]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949788.434147-775-46627806559662/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:29 compute-0 sudo[74317]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:30 compute-0 sudo[74469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amdutpfksdlisnqxpnjnfpzsociijbkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949789.8767738-775-227109886021303/AnsiballZ_stat.py'
Oct 08 18:56:30 compute-0 sudo[74469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:30 compute-0 python3.9[74471]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:30 compute-0 sudo[74469]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:30 compute-0 sudo[74592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aezhjhlxveydlznfwswpzwffpowzrewr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949789.8767738-775-227109886021303/AnsiballZ_copy.py'
Oct 08 18:56:30 compute-0 sudo[74592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:31 compute-0 python3.9[74594]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949789.8767738-775-227109886021303/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:31 compute-0 sudo[74592]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:31 compute-0 sudo[74744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mntiatyamvkqvhbngjxgteiixecfyyrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949791.1916087-775-170584705252857/AnsiballZ_stat.py'
Oct 08 18:56:31 compute-0 sudo[74744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:31 compute-0 python3.9[74746]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:31 compute-0 sudo[74744]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:32 compute-0 sudo[74867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emvbebguvmlzphxhzrbrrmbqtayiigrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949791.1916087-775-170584705252857/AnsiballZ_copy.py'
Oct 08 18:56:32 compute-0 sudo[74867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:32 compute-0 python3.9[74869]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949791.1916087-775-170584705252857/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:32 compute-0 sudo[74867]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:33 compute-0 python3.9[75019]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:56:33 compute-0 sudo[75172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecmcxeuczlxwyvkhrnwtlwurdkzurkay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949793.3319304-981-58658397641435/AnsiballZ_seboolean.py'
Oct 08 18:56:33 compute-0 sudo[75172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:34 compute-0 python3.9[75174]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Oct 08 18:56:35 compute-0 sudo[75172]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:35 compute-0 sudo[75331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvgrohtotxhbnbvbvygagvhopawavlhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949795.4857802-989-33180663016127/AnsiballZ_copy.py'
Oct 08 18:56:35 compute-0 dbus-broker-launch[836]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct 08 18:56:35 compute-0 sudo[75331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:36 compute-0 python3.9[75333]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:36 compute-0 sudo[75331]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:36 compute-0 sudo[75483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utbbfbbxonvgsqaorpcaukimenceexbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949796.2308073-989-142149691972636/AnsiballZ_copy.py'
Oct 08 18:56:36 compute-0 sudo[75483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:36 compute-0 python3.9[75485]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:36 compute-0 sudo[75483]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:37 compute-0 sudo[75635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkvhjelulscovnwhfbsribhvwhgkenzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949796.9810104-989-232492946308513/AnsiballZ_copy.py'
Oct 08 18:56:37 compute-0 sudo[75635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:37 compute-0 python3.9[75637]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:37 compute-0 sudo[75635]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:38 compute-0 sudo[75787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrpcltlctphbqmapsayvfsnzypjahaoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949797.7051876-989-13680884682965/AnsiballZ_copy.py'
Oct 08 18:56:38 compute-0 sudo[75787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:38 compute-0 python3.9[75789]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:38 compute-0 sudo[75787]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:38 compute-0 sudo[75939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwkatgznisvlyfebugislcnzrxukslvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949798.5892262-989-194998868704863/AnsiballZ_copy.py'
Oct 08 18:56:38 compute-0 sudo[75939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:39 compute-0 python3.9[75941]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:39 compute-0 sudo[75939]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:39 compute-0 sudo[76091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omyocisibctjyzakpdewcwhvglglaqji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949799.3439057-1025-79446847248102/AnsiballZ_copy.py'
Oct 08 18:56:39 compute-0 sudo[76091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:39 compute-0 python3.9[76093]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:39 compute-0 sudo[76091]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:40 compute-0 sudo[76243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yakkhohmoqwohqwxhixxqdfowhuvuvsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949800.0247502-1025-21275455091447/AnsiballZ_copy.py'
Oct 08 18:56:40 compute-0 sudo[76243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:40 compute-0 python3.9[76245]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:40 compute-0 sudo[76243]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:41 compute-0 sudo[76395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aexgdgijkuqaftbjiulifyyiwjixzqyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949800.7078848-1025-114113237397363/AnsiballZ_copy.py'
Oct 08 18:56:41 compute-0 sudo[76395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:41 compute-0 python3.9[76397]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:41 compute-0 sudo[76395]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:41 compute-0 sudo[76547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtuvlhmjffzfbackbbbnjizjpndbmvev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949801.4493537-1025-202338524283413/AnsiballZ_copy.py'
Oct 08 18:56:41 compute-0 sudo[76547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:42 compute-0 python3.9[76549]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:42 compute-0 sudo[76547]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:42 compute-0 sudo[76699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xppujuvhxobsjawfdhnetujdgvonexcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949802.1701384-1025-235565792725839/AnsiballZ_copy.py'
Oct 08 18:56:42 compute-0 sudo[76699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:42 compute-0 python3.9[76701]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:42 compute-0 sudo[76699]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:43 compute-0 sudo[76851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wszhombrnazpyldkyxodaclqyxmqhdoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949802.8940892-1061-38009118292073/AnsiballZ_systemd.py'
Oct 08 18:56:43 compute-0 sudo[76851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:43 compute-0 python3.9[76853]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 08 18:56:43 compute-0 systemd[1]: Reloading.
Oct 08 18:56:43 compute-0 systemd-rc-local-generator[76879]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:56:43 compute-0 systemd-sysv-generator[76884]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:56:43 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Oct 08 18:56:43 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Oct 08 18:56:43 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Oct 08 18:56:43 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Oct 08 18:56:43 compute-0 systemd[1]: Starting libvirt logging daemon...
Oct 08 18:56:44 compute-0 systemd[1]: Started libvirt logging daemon.
Oct 08 18:56:44 compute-0 sudo[76851]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:56:44.214 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 18:56:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:56:44.215 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 18:56:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:56:44.215 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 18:56:44 compute-0 sudo[77044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjaumnwytywmhssclwjklfabnandwwez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949804.2773173-1061-142253182379643/AnsiballZ_systemd.py'
Oct 08 18:56:44 compute-0 sudo[77044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:44 compute-0 python3.9[77046]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 08 18:56:44 compute-0 systemd[1]: Reloading.
Oct 08 18:56:45 compute-0 systemd-rc-local-generator[77094]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:56:45 compute-0 systemd-sysv-generator[77098]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:56:45 compute-0 podman[77048]: 2025-10-08 18:56:45.086525277 +0000 UTC m=+0.110779895 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 08 18:56:45 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Oct 08 18:56:45 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Oct 08 18:56:45 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Oct 08 18:56:45 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Oct 08 18:56:45 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Oct 08 18:56:45 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Oct 08 18:56:45 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Oct 08 18:56:45 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Oct 08 18:56:45 compute-0 systemd[1]: Started libvirt nodedev daemon.
Oct 08 18:56:45 compute-0 sudo[77044]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:45 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Oct 08 18:56:45 compute-0 sudo[77303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-larltbotjzidgcuoyrgfkfihrrinxqwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949805.5171587-1061-172748775822660/AnsiballZ_systemd.py'
Oct 08 18:56:45 compute-0 sudo[77303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:45 compute-0 podman[77230]: 2025-10-08 18:56:45.893784758 +0000 UTC m=+0.141313173 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct 08 18:56:45 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Oct 08 18:56:45 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Oct 08 18:56:46 compute-0 python3.9[77310]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 08 18:56:46 compute-0 systemd[1]: Reloading.
Oct 08 18:56:46 compute-0 systemd-rc-local-generator[77345]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:56:46 compute-0 systemd-sysv-generator[77348]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:56:46 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Oct 08 18:56:46 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Oct 08 18:56:46 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Oct 08 18:56:46 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Oct 08 18:56:46 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 08 18:56:46 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 08 18:56:46 compute-0 sudo[77303]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:46 compute-0 setroubleshoot[77102]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l be07387d-8aeb-4798-8ef4-711c36244e22
Oct 08 18:56:46 compute-0 setroubleshoot[77102]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                 
                                                 *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                 
                                                 If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                 Then turn on full auditing to get path information about the offending file and generate the error again.
                                                 Do
                                                 
                                                 Turn on full auditing
                                                 # auditctl -w /etc/shadow -p w
                                                 Try to recreate AVC. Then execute
                                                 # ausearch -m avc -ts recent
                                                 If you see PATH record check ownership/permissions on file, and fix it,
                                                 otherwise report as a bugzilla.
                                                 
                                                 *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                 
                                                 If you believe that virtlogd should have the dac_read_search capability by default.
                                                 Then you should report this as a bug.
                                                 You can generate a local policy module to allow this access.
                                                 Do
                                                 allow this access for now by executing:
                                                 # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                 # semodule -X 300 -i my-virtlogd.pp
                                                 
Oct 08 18:56:47 compute-0 sudo[77525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lavbtuapygsyhotkjkbeozisilbboxrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949806.8076334-1061-217311324167743/AnsiballZ_systemd.py'
Oct 08 18:56:47 compute-0 sudo[77525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:47 compute-0 python3.9[77527]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 08 18:56:47 compute-0 systemd[1]: Reloading.
Oct 08 18:56:47 compute-0 systemd-sysv-generator[77560]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:56:47 compute-0 systemd-rc-local-generator[77555]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:56:47 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Oct 08 18:56:47 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Oct 08 18:56:47 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct 08 18:56:47 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Oct 08 18:56:47 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Oct 08 18:56:47 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Oct 08 18:56:47 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Oct 08 18:56:47 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Oct 08 18:56:47 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Oct 08 18:56:47 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Oct 08 18:56:47 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Oct 08 18:56:47 compute-0 systemd[1]: Started libvirt QEMU daemon.
Oct 08 18:56:47 compute-0 sudo[77525]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:48 compute-0 sudo[77739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nolvqhjrxcmdmdemyikkrarabrqebade ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949808.1155334-1061-645117351160/AnsiballZ_systemd.py'
Oct 08 18:56:48 compute-0 sudo[77739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:48 compute-0 python3.9[77741]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 08 18:56:48 compute-0 systemd[1]: Reloading.
Oct 08 18:56:48 compute-0 systemd-rc-local-generator[77769]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:56:48 compute-0 systemd-sysv-generator[77773]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:56:49 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Oct 08 18:56:49 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Oct 08 18:56:49 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Oct 08 18:56:49 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Oct 08 18:56:49 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Oct 08 18:56:49 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Oct 08 18:56:49 compute-0 systemd[1]: Starting libvirt secret daemon...
Oct 08 18:56:49 compute-0 systemd[1]: Started libvirt secret daemon.
Oct 08 18:56:49 compute-0 sudo[77739]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:49 compute-0 sudo[77950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxjrssowwwongfdgdjetqanoefgxfbcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949809.5152197-1098-216058115250498/AnsiballZ_file.py'
Oct 08 18:56:49 compute-0 sudo[77950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:50 compute-0 python3.9[77952]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:50 compute-0 sudo[77950]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:50 compute-0 sudo[78102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvlltjngqpkkdbtctehkpwytuzcjvfrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949810.2692413-1106-116410057167521/AnsiballZ_find.py'
Oct 08 18:56:50 compute-0 sudo[78102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:50 compute-0 python3.9[78104]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 08 18:56:50 compute-0 sudo[78102]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:51 compute-0 sudo[78254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-othdjrwskwspccucxktjsqllrkypzype ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949811.278163-1120-120463055275706/AnsiballZ_stat.py'
Oct 08 18:56:51 compute-0 sudo[78254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:51 compute-0 python3.9[78256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:51 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 08 18:56:51 compute-0 sudo[78254]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:52 compute-0 sudo[78378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jomrlpoqgchjqjwbwamcnfpaeuomqexj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949811.278163-1120-120463055275706/AnsiballZ_copy.py'
Oct 08 18:56:52 compute-0 sudo[78378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:52 compute-0 python3.9[78380]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949811.278163-1120-120463055275706/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:52 compute-0 sudo[78378]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:53 compute-0 sudo[78530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofidepvbtldqmiskajhennkklpmmipaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949812.7757306-1136-51161601872889/AnsiballZ_file.py'
Oct 08 18:56:53 compute-0 sudo[78530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:53 compute-0 python3.9[78532]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:53 compute-0 sudo[78530]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:53 compute-0 sudo[78682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkwsuoemccojzgwqwohwarlsweofcvrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949813.537368-1144-242611244558089/AnsiballZ_stat.py'
Oct 08 18:56:53 compute-0 sudo[78682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:54 compute-0 python3.9[78684]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:54 compute-0 sudo[78682]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:54 compute-0 sudo[78760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwjezcwimxxrbjdfjowcdbdkrwmwoymg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949813.537368-1144-242611244558089/AnsiballZ_file.py'
Oct 08 18:56:54 compute-0 sudo[78760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:54 compute-0 python3.9[78762]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:54 compute-0 sudo[78760]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:55 compute-0 sudo[78912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkocxzdjzeapdhjjcdcbykntwnevymrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949814.8341818-1156-25484359898664/AnsiballZ_stat.py'
Oct 08 18:56:55 compute-0 sudo[78912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:55 compute-0 python3.9[78914]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:55 compute-0 sudo[78912]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:55 compute-0 sudo[78990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyymamklpzrfedrilnktpnovskbsnhzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949814.8341818-1156-25484359898664/AnsiballZ_file.py'
Oct 08 18:56:55 compute-0 sudo[78990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:56 compute-0 python3.9[78992]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.eli36ciz recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:56 compute-0 sudo[78990]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:56 compute-0 sudo[79142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nashafshlvamobskgdpltudqhvrahdiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949816.2324803-1168-25386019869986/AnsiballZ_stat.py'
Oct 08 18:56:56 compute-0 sudo[79142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:56 compute-0 python3.9[79144]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:56 compute-0 sudo[79142]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:56 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Oct 08 18:56:57 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Oct 08 18:56:57 compute-0 sudo[79220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpgdtxvaonznpipyqmskoklvketyqqsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949816.2324803-1168-25386019869986/AnsiballZ_file.py'
Oct 08 18:56:57 compute-0 sudo[79220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:57 compute-0 python3.9[79223]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:56:57 compute-0 sudo[79220]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:57 compute-0 sudo[79373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owlcfdixawaijiufodezkvsmexffnszf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949817.4721842-1181-117652183036451/AnsiballZ_command.py'
Oct 08 18:56:57 compute-0 sudo[79373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:58 compute-0 python3.9[79375]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:56:58 compute-0 sudo[79373]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:58 compute-0 sudo[79526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttgiriwgyaxeqebqracggradhrkxyitr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759949818.218067-1189-84807683713599/AnsiballZ_edpm_nftables_from_files.py'
Oct 08 18:56:58 compute-0 sudo[79526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:58 compute-0 python3[79528]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 08 18:56:59 compute-0 sudo[79526]: pam_unix(sudo:session): session closed for user root
Oct 08 18:56:59 compute-0 sudo[79678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grmltmgwtitcqohtgjizneejpuykvyxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949819.1781921-1197-203393099431775/AnsiballZ_stat.py'
Oct 08 18:56:59 compute-0 sudo[79678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:56:59 compute-0 python3.9[79680]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:56:59 compute-0 sudo[79678]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:00 compute-0 sudo[79756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buefefqbiyryetchhefeuurbqorgfzcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949819.1781921-1197-203393099431775/AnsiballZ_file.py'
Oct 08 18:57:00 compute-0 sudo[79756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:00 compute-0 python3.9[79758]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:00 compute-0 sudo[79756]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:01 compute-0 sudo[79908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajplnnfjlrmdmgyrpdqotifceikevguz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949820.679322-1209-39217986266147/AnsiballZ_stat.py'
Oct 08 18:57:01 compute-0 sudo[79908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:01 compute-0 python3.9[79910]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:57:01 compute-0 sudo[79908]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:01 compute-0 sudo[79986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flvsrhqnakusoquukcxomxmkxahmjiyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949820.679322-1209-39217986266147/AnsiballZ_file.py'
Oct 08 18:57:01 compute-0 sudo[79986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:01 compute-0 python3.9[79988]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:01 compute-0 sudo[79986]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:02 compute-0 sudo[80138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goxdzyisktphstdpxdyapmlarvzhxplm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949822.0220459-1221-15432492102752/AnsiballZ_stat.py'
Oct 08 18:57:02 compute-0 sudo[80138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:02 compute-0 python3.9[80140]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:57:02 compute-0 sudo[80138]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:02 compute-0 sudo[80216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqxpfatfnyrmnntgeciguapthjjbwesc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949822.0220459-1221-15432492102752/AnsiballZ_file.py'
Oct 08 18:57:02 compute-0 sudo[80216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:03 compute-0 python3.9[80218]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:03 compute-0 sudo[80216]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:03 compute-0 sudo[80368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxchptdjhghkfemmtrysezxypkvwqbqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949823.2858055-1233-57305008804622/AnsiballZ_stat.py'
Oct 08 18:57:03 compute-0 sudo[80368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:03 compute-0 python3.9[80370]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:57:03 compute-0 sudo[80368]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:04 compute-0 sudo[80446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcxiilwkvepcerbpokvwgbjpznbjxllz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949823.2858055-1233-57305008804622/AnsiballZ_file.py'
Oct 08 18:57:04 compute-0 sudo[80446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:04 compute-0 python3.9[80448]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:04 compute-0 sudo[80446]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:05 compute-0 sudo[80598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-petkjphbhhbgdytdcwdfibdcnfshdpqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949824.6477885-1245-85778835292306/AnsiballZ_stat.py'
Oct 08 18:57:05 compute-0 sudo[80598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:05 compute-0 python3.9[80600]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:57:05 compute-0 sudo[80598]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:05 compute-0 sudo[80723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nspmtogvywnitenmomkhsferkmijalcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949824.6477885-1245-85778835292306/AnsiballZ_copy.py'
Oct 08 18:57:05 compute-0 sudo[80723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:05 compute-0 python3.9[80725]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949824.6477885-1245-85778835292306/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:05 compute-0 sudo[80723]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:06 compute-0 sudo[80875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsecoedmllopnrwjoimxyegjzujipjrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949826.2235444-1260-188833851002078/AnsiballZ_file.py'
Oct 08 18:57:06 compute-0 sudo[80875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:06 compute-0 python3.9[80877]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:06 compute-0 sudo[80875]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:07 compute-0 sudo[81027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kavqkdpomudjsojyijdjkwaatdovqnzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949827.0194342-1268-213422038023069/AnsiballZ_command.py'
Oct 08 18:57:07 compute-0 sudo[81027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:07 compute-0 python3.9[81029]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:57:07 compute-0 sudo[81027]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:08 compute-0 sudo[81182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kteqfvjuhupmzqrugbfkwayqdbfjdaqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949827.8214455-1276-135705766686094/AnsiballZ_blockinfile.py'
Oct 08 18:57:08 compute-0 sudo[81182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:08 compute-0 python3.9[81184]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:08 compute-0 sudo[81182]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:09 compute-0 sudo[81334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yejdukpvpuylrzldzkqsmveahwhgwsjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949828.8135896-1285-161229366697858/AnsiballZ_command.py'
Oct 08 18:57:09 compute-0 sudo[81334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:09 compute-0 python3.9[81336]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:57:09 compute-0 sudo[81334]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:10 compute-0 sudo[81487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blfnfzscjjunphmsoreogzxnalgxkupx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949829.6823726-1293-213437015628604/AnsiballZ_stat.py'
Oct 08 18:57:10 compute-0 sudo[81487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:10 compute-0 python3.9[81489]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:57:10 compute-0 sudo[81487]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:10 compute-0 sudo[81641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dknpbjhijjknpzgiyxrqtjifniecmkzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949830.499919-1301-280286736867593/AnsiballZ_command.py'
Oct 08 18:57:10 compute-0 sudo[81641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:11 compute-0 python3.9[81643]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:57:11 compute-0 sudo[81641]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:11 compute-0 sudo[81796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fciferqeowpgsucujiibfgwyxjebulau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949831.2642307-1309-190366988672787/AnsiballZ_file.py'
Oct 08 18:57:11 compute-0 sudo[81796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:11 compute-0 python3.9[81798]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:11 compute-0 sudo[81796]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:12 compute-0 sudo[81948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrcukblhqgrycgldkqnjhknqoxgzxsvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949832.035195-1317-144450595839219/AnsiballZ_stat.py'
Oct 08 18:57:12 compute-0 sudo[81948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:12 compute-0 python3.9[81950]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:57:12 compute-0 sudo[81948]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:12 compute-0 sudo[82071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojlvwzulhzsburhdyuddpoxwmobfihed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949832.035195-1317-144450595839219/AnsiballZ_copy.py'
Oct 08 18:57:12 compute-0 sudo[82071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:13 compute-0 python3.9[82073]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949832.035195-1317-144450595839219/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:13 compute-0 sudo[82071]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:13 compute-0 sudo[82223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvbaqvnjpgldxrseivlnphjajlixbmvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949833.3957467-1332-229620971419936/AnsiballZ_stat.py'
Oct 08 18:57:13 compute-0 sudo[82223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:13 compute-0 python3.9[82225]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:57:13 compute-0 sudo[82223]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:14 compute-0 sudo[82346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgjkzhrhapujyiihnbvbhfhvtvysltxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949833.3957467-1332-229620971419936/AnsiballZ_copy.py'
Oct 08 18:57:14 compute-0 sudo[82346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:14 compute-0 python3.9[82348]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949833.3957467-1332-229620971419936/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:14 compute-0 sudo[82346]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:15 compute-0 sudo[82498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hydyyhomwohgywpjmiznxiumckmdetgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949834.8769894-1347-261928793071585/AnsiballZ_stat.py'
Oct 08 18:57:15 compute-0 sudo[82498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:15 compute-0 python3.9[82500]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:57:15 compute-0 sudo[82498]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:15 compute-0 podman[82530]: 2025-10-08 18:57:15.666847851 +0000 UTC m=+0.075584825 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 08 18:57:15 compute-0 sudo[82641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eblphvlkfeyzuzhyoupzvfitkzwmivvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949834.8769894-1347-261928793071585/AnsiballZ_copy.py'
Oct 08 18:57:15 compute-0 sudo[82641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:16 compute-0 python3.9[82643]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949834.8769894-1347-261928793071585/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:16 compute-0 sudo[82641]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:16 compute-0 podman[82746]: 2025-10-08 18:57:16.69790287 +0000 UTC m=+0.113695311 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Oct 08 18:57:16 compute-0 sudo[82819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dydjbrrywsxpwwigvkmoxgvomxlfgffs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949836.30302-1362-23176078153848/AnsiballZ_systemd.py'
Oct 08 18:57:16 compute-0 sudo[82819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:17 compute-0 python3.9[82821]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:57:17 compute-0 systemd[1]: Reloading.
Oct 08 18:57:17 compute-0 systemd-rc-local-generator[82850]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:57:17 compute-0 systemd-sysv-generator[82854]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:57:17 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Oct 08 18:57:17 compute-0 sudo[82819]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:17 compute-0 sudo[83011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgmbgcvftagojsrfwiokhjgwauditywc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949837.5335329-1370-202903030710839/AnsiballZ_systemd.py'
Oct 08 18:57:17 compute-0 sudo[83011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:18 compute-0 python3.9[83013]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 08 18:57:18 compute-0 systemd[1]: Reloading.
Oct 08 18:57:18 compute-0 systemd-rc-local-generator[83036]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:57:18 compute-0 systemd-sysv-generator[83041]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:57:18 compute-0 systemd[1]: Reloading.
Oct 08 18:57:18 compute-0 systemd-rc-local-generator[83072]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:57:18 compute-0 systemd-sysv-generator[83078]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:57:18 compute-0 sudo[83011]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:19 compute-0 sshd-session[28791]: Connection closed by 192.168.122.30 port 36532
Oct 08 18:57:19 compute-0 sshd-session[28788]: pam_unix(sshd:session): session closed for user zuul
Oct 08 18:57:19 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Oct 08 18:57:19 compute-0 systemd[1]: session-7.scope: Consumed 3min 50.466s CPU time.
Oct 08 18:57:19 compute-0 systemd-logind[844]: Session 7 logged out. Waiting for processes to exit.
Oct 08 18:57:19 compute-0 systemd-logind[844]: Removed session 7.
Oct 08 18:57:24 compute-0 sshd-session[83109]: Accepted publickey for zuul from 192.168.122.30 port 37192 ssh2: ECDSA SHA256:i+73Mx2Y/ukt1b+huf+9w+ftZalnyybbDU6glTR0JfU
Oct 08 18:57:24 compute-0 systemd-logind[844]: New session 8 of user zuul.
Oct 08 18:57:24 compute-0 systemd[1]: Started Session 8 of User zuul.
Oct 08 18:57:24 compute-0 sshd-session[83109]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 08 18:57:25 compute-0 python3.9[83262]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 08 18:57:26 compute-0 sudo[83416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zibspsdrnmsvvhbjvvbjzahcrdmgfcvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949846.3371012-34-175233465568835/AnsiballZ_file.py'
Oct 08 18:57:26 compute-0 sudo[83416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:27 compute-0 python3.9[83418]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:57:27 compute-0 sudo[83416]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:27 compute-0 sudo[83568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzjvkjwszblkorrtzsdcvyapgjderuex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949847.3056746-34-143650721777343/AnsiballZ_file.py'
Oct 08 18:57:27 compute-0 sudo[83568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:27 compute-0 python3.9[83570]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:57:27 compute-0 sudo[83568]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:28 compute-0 sudo[83720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekwvkjjptwoqggbdorfymjktvxmxfwyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949848.114419-34-148514938963866/AnsiballZ_file.py'
Oct 08 18:57:28 compute-0 sudo[83720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:28 compute-0 python3.9[83722]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:57:28 compute-0 sudo[83720]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:29 compute-0 sudo[83872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gantionrxatfublmvjgaxkvvbssqrwsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949848.872173-34-100066281835521/AnsiballZ_file.py'
Oct 08 18:57:29 compute-0 sudo[83872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:29 compute-0 python3.9[83874]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 08 18:57:29 compute-0 sudo[83872]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:30 compute-0 sudo[84024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcfbhccbgoliadhffkzhdihmvizxejvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949849.7199323-34-41738643085064/AnsiballZ_file.py'
Oct 08 18:57:30 compute-0 sudo[84024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:30 compute-0 python3.9[84026]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:57:30 compute-0 sudo[84024]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:31 compute-0 sudo[84176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lckovrlaajqhppwgvyrgmunazminlung ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949850.5171394-70-77235057881875/AnsiballZ_stat.py'
Oct 08 18:57:31 compute-0 sudo[84176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:31 compute-0 python3.9[84178]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:57:31 compute-0 sudo[84176]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:32 compute-0 sudo[84330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iebjfcznncbzmhsvqzrjeemptgmnleor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949851.4646297-78-265534299593085/AnsiballZ_systemd.py'
Oct 08 18:57:32 compute-0 sudo[84330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:32 compute-0 python3.9[84332]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:57:32 compute-0 systemd[1]: Reloading.
Oct 08 18:57:32 compute-0 systemd-rc-local-generator[84360]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:57:32 compute-0 systemd-sysv-generator[84363]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:57:32 compute-0 sudo[84330]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:33 compute-0 sudo[84519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btpmfnlrpivaptxtqpzfgmuudpqmpcom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949853.0591426-86-39252177521813/AnsiballZ_service_facts.py'
Oct 08 18:57:33 compute-0 sudo[84519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:33 compute-0 python3.9[84521]: ansible-ansible.builtin.service_facts Invoked
Oct 08 18:57:33 compute-0 network[84538]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 08 18:57:33 compute-0 network[84539]: 'network-scripts' will be removed from distribution in near future.
Oct 08 18:57:33 compute-0 network[84540]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 08 18:57:37 compute-0 sudo[84519]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:38 compute-0 sudo[84811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzlzpakjxjdcgqbqmypwdurmnsdtlqui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949858.2015996-94-218645516098655/AnsiballZ_systemd.py'
Oct 08 18:57:38 compute-0 sudo[84811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:38 compute-0 python3.9[84813]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:57:38 compute-0 systemd[1]: Reloading.
Oct 08 18:57:39 compute-0 systemd-rc-local-generator[84844]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:57:39 compute-0 systemd-sysv-generator[84847]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:57:39 compute-0 sudo[84811]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:40 compute-0 python3.9[85000]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:57:40 compute-0 sudo[85150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewqudqipawybtkqwbypjdcehrmawszzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949860.4314737-111-207590714433281/AnsiballZ_podman_container.py'
Oct 08 18:57:40 compute-0 sudo[85150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:41 compute-0 python3.9[85152]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None 
passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 08 18:57:41 compute-0 podman[85187]: 2025-10-08 18:57:41.421983955 +0000 UTC m=+0.053951591 container create bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct 08 18:57:41 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 08 18:57:41 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 08 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.4831] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/19)
Oct 08 18:57:41 compute-0 podman[85187]: 2025-10-08 18:57:41.401368236 +0000 UTC m=+0.033335892 image pull 74877095db294c27659f24e7f86074178a6f28eee68561c30e3ce4d18519e09c quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f
Oct 08 18:57:41 compute-0 kernel: podman0: port 1(veth0) entered blocking state
Oct 08 18:57:41 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Oct 08 18:57:41 compute-0 kernel: veth0: entered allmulticast mode
Oct 08 18:57:41 compute-0 kernel: veth0: entered promiscuous mode
Oct 08 18:57:41 compute-0 kernel: podman0: port 1(veth0) entered blocking state
Oct 08 18:57:41 compute-0 kernel: podman0: port 1(veth0) entered forwarding state
Oct 08 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5111] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/20)
Oct 08 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5134] device (veth0): carrier: link connected
Oct 08 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5138] device (podman0): carrier: link connected
Oct 08 18:57:41 compute-0 systemd-udevd[85215]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 18:57:41 compute-0 systemd-udevd[85218]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 18:57:41 compute-0 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Oct 08 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5569] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 08 18:57:41 compute-0 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Oct 08 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5582] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 08 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5597] device (podman0): Activation: starting connection 'podman0' (a1f0edc0-82df-477b-98ca-0e9087639e8e)
Oct 08 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5600] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 08 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5603] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 08 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5609] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 08 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5613] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 08 18:57:41 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 08 18:57:41 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 08 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5877] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 08 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5880] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 08 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5888] device (podman0): Activation: successful, device activated.
Oct 08 18:57:41 compute-0 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Oct 08 18:57:41 compute-0 systemd[1]: Started libpod-conmon-bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d.scope.
Oct 08 18:57:41 compute-0 systemd[1]: Started libcrun container.
Oct 08 18:57:41 compute-0 podman[85187]: 2025-10-08 18:57:41.887503298 +0000 UTC m=+0.519470954 container init bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 18:57:41 compute-0 podman[85187]: 2025-10-08 18:57:41.89601921 +0000 UTC m=+0.527986856 container start bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 08 18:57:41 compute-0 podman[85187]: 2025-10-08 18:57:41.899350047 +0000 UTC m=+0.531317703 container attach bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct 08 18:57:41 compute-0 iscsid_config[85351]: iqn.1994-05.com.redhat:c69c6f2d1774
Oct 08 18:57:41 compute-0 systemd[1]: libpod-bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d.scope: Deactivated successfully.
Oct 08 18:57:41 compute-0 conmon[85351]: conmon bd0e98d68ba1af1ef586 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d.scope/container/memory.events
Oct 08 18:57:41 compute-0 podman[85187]: 2025-10-08 18:57:41.908207299 +0000 UTC m=+0.540174935 container died bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 18:57:41 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Oct 08 18:57:41 compute-0 kernel: veth0 (unregistering): left allmulticast mode
Oct 08 18:57:41 compute-0 kernel: veth0 (unregistering): left promiscuous mode
Oct 08 18:57:41 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Oct 08 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.9807] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 18:57:42 compute-0 systemd[1]: run-netns-netns\x2dfc6b247d\x2deeb4\x2dfcd6\x2d87fd\x2d6cf17d892959.mount: Deactivated successfully.
Oct 08 18:57:42 compute-0 podman[85187]: 2025-10-08 18:57:42.379567873 +0000 UTC m=+1.011535509 container remove bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 18:57:42 compute-0 python3.9[85152]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f /usr/sbin/iscsi-iname
Oct 08 18:57:42 compute-0 systemd[1]: libpod-conmon-bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d.scope: Deactivated successfully.
Oct 08 18:57:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-7264b161f5c9e247605b5d6c84405c5f66ea8d06ec38ec28d9ccb5ac692914af-merged.mount: Deactivated successfully.
Oct 08 18:57:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d-userdata-shm.mount: Deactivated successfully.
Oct 08 18:57:42 compute-0 python3.9[85152]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: 
                                            DEPRECATED command:
                                            It is recommended to use Quadlets for running containers and pods under systemd.
                                            
                                            Please refer to podman-systemd.unit(5) for details.
                                            Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
Oct 08 18:57:42 compute-0 sudo[85150]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:43 compute-0 sudo[85590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvyimkdxwxujlydvwmqdwdjfkdsqaejn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949862.696217-119-123593211545325/AnsiballZ_stat.py'
Oct 08 18:57:43 compute-0 sudo[85590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:43 compute-0 python3.9[85592]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:57:43 compute-0 sudo[85590]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:43 compute-0 sudo[85713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nptvocvbmsfeltkivkekklhudxhfbwmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949862.696217-119-123593211545325/AnsiballZ_copy.py'
Oct 08 18:57:43 compute-0 sudo[85713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:44 compute-0 python3.9[85715]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949862.696217-119-123593211545325/.source.iscsi _original_basename=.hmug41v6 follow=False checksum=0b9689ec2ce017595dffb256feaf6209f1462b6d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:57:44.216 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 18:57:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:57:44.217 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 18:57:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:57:44.217 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 18:57:44 compute-0 sudo[85713]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:44 compute-0 sudo[85865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdmrlnekhvgyeruylymajxhdqeffgagy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949864.40204-134-15360586304369/AnsiballZ_file.py'
Oct 08 18:57:44 compute-0 sudo[85865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:45 compute-0 python3.9[85867]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:45 compute-0 sudo[85865]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:45 compute-0 python3.9[86017]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:57:46 compute-0 sudo[86181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucvvqwnanklywuconwrywyumbtcolqzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949866.028987-151-264091066991192/AnsiballZ_lineinfile.py'
Oct 08 18:57:46 compute-0 sudo[86181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:46 compute-0 podman[86143]: 2025-10-08 18:57:46.581995367 +0000 UTC m=+0.091295682 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct 08 18:57:46 compute-0 python3.9[86187]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:46 compute-0 sudo[86181]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:47 compute-0 sudo[86353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cllxqjrjevvwpukduwfqzponyekmycuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949867.0650446-160-211108728388676/AnsiballZ_file.py'
Oct 08 18:57:47 compute-0 sudo[86353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:47 compute-0 podman[86314]: 2025-10-08 18:57:47.465884509 +0000 UTC m=+0.101963430 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 08 18:57:47 compute-0 python3.9[86361]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:57:47 compute-0 sudo[86353]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:48 compute-0 sudo[86518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrroggbbystkldzfsteiftrjuhyaqgen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949867.8395092-168-138588139979119/AnsiballZ_stat.py'
Oct 08 18:57:48 compute-0 sudo[86518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:48 compute-0 python3.9[86520]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:57:48 compute-0 sudo[86518]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:48 compute-0 sudo[86596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnxsxxrxyceslbjtheanpndxsfrkjzjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949867.8395092-168-138588139979119/AnsiballZ_file.py'
Oct 08 18:57:48 compute-0 sudo[86596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:48 compute-0 python3.9[86598]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:57:48 compute-0 sudo[86596]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:49 compute-0 sudo[86748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xekcgytrlushfbedqsubcdwhdhcithzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949869.0722582-168-262606068954446/AnsiballZ_stat.py'
Oct 08 18:57:49 compute-0 sudo[86748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:49 compute-0 python3.9[86750]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:57:49 compute-0 sudo[86748]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:49 compute-0 sudo[86826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nofywtrjbzpwgfedpdrskkyhdckzbidv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949869.0722582-168-262606068954446/AnsiballZ_file.py'
Oct 08 18:57:49 compute-0 sudo[86826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:50 compute-0 python3.9[86828]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:57:50 compute-0 sudo[86826]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:50 compute-0 sudo[86978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfmdyztbzmxwgycoqcqvwtiezdgbgbqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949870.3522527-191-240034338844237/AnsiballZ_file.py'
Oct 08 18:57:50 compute-0 sudo[86978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:50 compute-0 python3.9[86980]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:50 compute-0 sudo[86978]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:51 compute-0 sudo[87130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndblgtknsatqfleucztetiwkunuybadu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949871.090551-199-278178062636675/AnsiballZ_stat.py'
Oct 08 18:57:51 compute-0 sudo[87130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:51 compute-0 python3.9[87132]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:57:51 compute-0 sudo[87130]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:52 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 08 18:57:52 compute-0 sudo[87208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhconmrwycyxzovzomrgcftersudaarn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949871.090551-199-278178062636675/AnsiballZ_file.py'
Oct 08 18:57:52 compute-0 sudo[87208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:52 compute-0 python3.9[87210]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:52 compute-0 sudo[87208]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:52 compute-0 sudo[87360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcjayrmfxtfsvxsqamdrhdzfhkjblsdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949872.5292618-211-115815755233753/AnsiballZ_stat.py'
Oct 08 18:57:52 compute-0 sudo[87360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:53 compute-0 python3.9[87362]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:57:53 compute-0 sudo[87360]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:53 compute-0 sudo[87438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skamgwpgbsmeaxwldwigsjyxymvgnwen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949872.5292618-211-115815755233753/AnsiballZ_file.py'
Oct 08 18:57:53 compute-0 sudo[87438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:53 compute-0 python3.9[87440]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:53 compute-0 sudo[87438]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:54 compute-0 sudo[87590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vetvdzobgaepzryeolbzdgltfjgberzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949873.903957-223-183760843539792/AnsiballZ_systemd.py'
Oct 08 18:57:54 compute-0 sudo[87590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:54 compute-0 python3.9[87592]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:57:54 compute-0 systemd[1]: Reloading.
Oct 08 18:57:54 compute-0 systemd-sysv-generator[87622]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:57:54 compute-0 systemd-rc-local-generator[87619]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:57:54 compute-0 sudo[87590]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:55 compute-0 sudo[87778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxrrndulvfonbgwbntbyjhencjrqacqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949875.1805916-231-189323611944739/AnsiballZ_stat.py'
Oct 08 18:57:55 compute-0 sudo[87778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:55 compute-0 python3.9[87780]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:57:55 compute-0 sudo[87778]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:56 compute-0 sudo[87856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iefybcyunetzfnrxkzbdjzvlsccjkijv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949875.1805916-231-189323611944739/AnsiballZ_file.py'
Oct 08 18:57:56 compute-0 sudo[87856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:56 compute-0 python3.9[87858]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:56 compute-0 sudo[87856]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:56 compute-0 sudo[88008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpfmtngjfhvgwzpfauvgakrvhnojvnjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949876.4619482-243-69948898567909/AnsiballZ_stat.py'
Oct 08 18:57:56 compute-0 sudo[88008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:57 compute-0 python3.9[88010]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:57:57 compute-0 sudo[88008]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:57 compute-0 sudo[88086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgurowzassudfglbbdzvriagpzoneiys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949876.4619482-243-69948898567909/AnsiballZ_file.py'
Oct 08 18:57:57 compute-0 sudo[88086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:57 compute-0 python3.9[88088]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:57:57 compute-0 sudo[88086]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:58 compute-0 sudo[88238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wphivqptmmplzuphjmhvqhgsruoelvpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949877.7177906-255-65406704918456/AnsiballZ_systemd.py'
Oct 08 18:57:58 compute-0 sudo[88238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:58 compute-0 python3.9[88240]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:57:58 compute-0 systemd[1]: Reloading.
Oct 08 18:57:58 compute-0 systemd-rc-local-generator[88265]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:57:58 compute-0 systemd-sysv-generator[88270]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:57:58 compute-0 systemd[1]: Starting Create netns directory...
Oct 08 18:57:58 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 08 18:57:58 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 08 18:57:58 compute-0 systemd[1]: Finished Create netns directory.
Oct 08 18:57:58 compute-0 sudo[88238]: pam_unix(sudo:session): session closed for user root
Oct 08 18:57:59 compute-0 sudo[88430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-votksvhswsaetxzrdqafgwbpbyueixku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949879.03615-265-74349768852428/AnsiballZ_file.py'
Oct 08 18:57:59 compute-0 sudo[88430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:57:59 compute-0 python3.9[88432]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:57:59 compute-0 sudo[88430]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:00 compute-0 sudo[88582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrjfebecuebbpdbtzxuybpskienyxzkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949879.8423088-273-112768493057349/AnsiballZ_stat.py'
Oct 08 18:58:00 compute-0 sudo[88582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:00 compute-0 python3.9[88584]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:58:00 compute-0 sudo[88582]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:00 compute-0 sudo[88705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhqwuwaizgtfleeluebrxczxypnhkyqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949879.8423088-273-112768493057349/AnsiballZ_copy.py'
Oct 08 18:58:00 compute-0 sudo[88705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:00 compute-0 python3.9[88707]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949879.8423088-273-112768493057349/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:58:01 compute-0 sudo[88705]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:01 compute-0 sudo[88857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlvximyyknnmekdfyfeglnzmubtitndn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949881.4293985-290-126458819682928/AnsiballZ_file.py'
Oct 08 18:58:01 compute-0 sudo[88857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:01 compute-0 python3.9[88859]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:58:01 compute-0 sudo[88857]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:02 compute-0 sudo[89009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uchyslurgioyxfkhrnyypikdkxnwxzdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949882.2138197-298-181482929234274/AnsiballZ_stat.py'
Oct 08 18:58:02 compute-0 sudo[89009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:02 compute-0 python3.9[89011]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:58:02 compute-0 sudo[89009]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:03 compute-0 sudo[89132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkexmzqnknsjajbrpsqnrpoxukoeqcve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949882.2138197-298-181482929234274/AnsiballZ_copy.py'
Oct 08 18:58:03 compute-0 sudo[89132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:03 compute-0 python3.9[89134]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949882.2138197-298-181482929234274/.source.json _original_basename=.60b0gzvx follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:03 compute-0 sudo[89132]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:03 compute-0 sudo[89284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yclgsomcsgnszrxdgszussmccxhddqqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949883.538044-313-91315145724422/AnsiballZ_file.py'
Oct 08 18:58:03 compute-0 sudo[89284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:04 compute-0 python3.9[89286]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:04 compute-0 sudo[89284]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:04 compute-0 sudo[89436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccrtudketumyzxxrentjnmiuihreiyfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949884.3279676-321-22309247705610/AnsiballZ_stat.py'
Oct 08 18:58:04 compute-0 sudo[89436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:04 compute-0 sudo[89436]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:05 compute-0 sudo[89559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mapiojnbjcdatcmfrtuzayjsmuzejbpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949884.3279676-321-22309247705610/AnsiballZ_copy.py'
Oct 08 18:58:05 compute-0 sudo[89559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:05 compute-0 sudo[89559]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:06 compute-0 sudo[89711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozivdkqycevqzkqjgaobvjvfeyxvibzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949885.8170106-338-251148120820346/AnsiballZ_container_config_data.py'
Oct 08 18:58:06 compute-0 sudo[89711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:06 compute-0 python3.9[89713]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Oct 08 18:58:06 compute-0 sudo[89711]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:07 compute-0 sudo[89863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrxejjyhdovypwbnuigymgohaktiyiwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949886.817847-347-270188916549362/AnsiballZ_container_config_hash.py'
Oct 08 18:58:07 compute-0 sudo[89863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:07 compute-0 python3.9[89865]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 08 18:58:07 compute-0 sudo[89863]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:08 compute-0 sudo[90015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzazagndtdrrnokfzswqdwinjcjugsci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949887.7945154-356-66661677299553/AnsiballZ_podman_container_info.py'
Oct 08 18:58:08 compute-0 sudo[90015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:08 compute-0 python3.9[90017]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 08 18:58:08 compute-0 sudo[90015]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:09 compute-0 sudo[90193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihkytwncqqqrurlubrtanjhojqqgsusa ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759949889.2686787-369-142354156916459/AnsiballZ_edpm_container_manage.py'
Oct 08 18:58:09 compute-0 sudo[90193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:10 compute-0 python3[90195]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 08 18:58:10 compute-0 podman[90232]: 2025-10-08 18:58:10.443302425 +0000 UTC m=+0.078584449 container create 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 08 18:58:10 compute-0 podman[90232]: 2025-10-08 18:58:10.404403497 +0000 UTC m=+0.039685561 image pull 74877095db294c27659f24e7f86074178a6f28eee68561c30e3ce4d18519e09c quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f
Oct 08 18:58:10 compute-0 python3[90195]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f
Oct 08 18:58:10 compute-0 sudo[90193]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:11 compute-0 sudo[90419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rodwdpoxwoogaubatsjoogtizeaegoyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949890.8350828-377-137549466085685/AnsiballZ_stat.py'
Oct 08 18:58:11 compute-0 sudo[90419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:11 compute-0 python3.9[90421]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:58:11 compute-0 sudo[90419]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:12 compute-0 sudo[90573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbqwlqgfisgudclmpnuyaciztigicpay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949891.694991-386-245343355889798/AnsiballZ_file.py'
Oct 08 18:58:12 compute-0 sudo[90573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:12 compute-0 python3.9[90575]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:12 compute-0 sudo[90573]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:12 compute-0 sudo[90649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjwymchrdctfidrmjzsbttehnubpacsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949891.694991-386-245343355889798/AnsiballZ_stat.py'
Oct 08 18:58:12 compute-0 sudo[90649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:12 compute-0 python3.9[90651]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:58:12 compute-0 sudo[90649]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:13 compute-0 sudo[90800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drfmxuvciksmazojjzdywbawpnpzikxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949892.8219678-386-61766793258991/AnsiballZ_copy.py'
Oct 08 18:58:13 compute-0 sudo[90800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:13 compute-0 python3.9[90802]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759949892.8219678-386-61766793258991/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:13 compute-0 sudo[90800]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:13 compute-0 sudo[90876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afrkjdtxbgbezgcdeuccofzwivqkulis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949892.8219678-386-61766793258991/AnsiballZ_systemd.py'
Oct 08 18:58:13 compute-0 sudo[90876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:14 compute-0 python3.9[90878]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 08 18:58:14 compute-0 systemd[1]: Reloading.
Oct 08 18:58:14 compute-0 systemd-rc-local-generator[90902]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:58:14 compute-0 systemd-sysv-generator[90906]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:58:14 compute-0 sudo[90876]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:14 compute-0 sudo[90986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdtxpxgbjdnmjjantmpruxbnpswgpoig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949892.8219678-386-61766793258991/AnsiballZ_systemd.py'
Oct 08 18:58:14 compute-0 sudo[90986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:15 compute-0 python3.9[90988]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:58:15 compute-0 systemd[1]: Reloading.
Oct 08 18:58:15 compute-0 systemd-rc-local-generator[91019]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:58:15 compute-0 systemd-sysv-generator[91023]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:58:15 compute-0 systemd[1]: Starting iscsid container...
Oct 08 18:58:15 compute-0 systemd[1]: Started libcrun container.
Oct 08 18:58:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/464a574f8cabd9aa5d75cf9f09985ec149dd35110e8bee7783d53abdfa52b1bf/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Oct 08 18:58:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/464a574f8cabd9aa5d75cf9f09985ec149dd35110e8bee7783d53abdfa52b1bf/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 08 18:58:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/464a574f8cabd9aa5d75cf9f09985ec149dd35110e8bee7783d53abdfa52b1bf/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 08 18:58:15 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845.
Oct 08 18:58:15 compute-0 podman[91028]: 2025-10-08 18:58:15.723721289 +0000 UTC m=+0.163008194 container init 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct 08 18:58:15 compute-0 iscsid[91044]: + sudo -E kolla_set_configs
Oct 08 18:58:15 compute-0 podman[91028]: 2025-10-08 18:58:15.754911045 +0000 UTC m=+0.194197920 container start 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=iscsid)
Oct 08 18:58:15 compute-0 podman[91028]: iscsid
Oct 08 18:58:15 compute-0 systemd[1]: Started iscsid container.
Oct 08 18:58:15 compute-0 sudo[91051]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 08 18:58:15 compute-0 systemd[1]: Created slice User Slice of UID 0.
Oct 08 18:58:15 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 08 18:58:15 compute-0 sudo[90986]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:15 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 08 18:58:15 compute-0 systemd[1]: Starting User Manager for UID 0...
Oct 08 18:58:15 compute-0 systemd[91070]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Oct 08 18:58:15 compute-0 podman[91050]: 2025-10-08 18:58:15.873280025 +0000 UTC m=+0.096912345 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid)
Oct 08 18:58:15 compute-0 systemd[1]: 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845-211134ce0ad75163.service: Main process exited, code=exited, status=1/FAILURE
Oct 08 18:58:15 compute-0 systemd[1]: 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845-211134ce0ad75163.service: Failed with result 'exit-code'.
Oct 08 18:58:15 compute-0 systemd[91070]: Queued start job for default target Main User Target.
Oct 08 18:58:15 compute-0 systemd[91070]: Created slice User Application Slice.
Oct 08 18:58:15 compute-0 systemd[91070]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 08 18:58:15 compute-0 systemd[91070]: Started Daily Cleanup of User's Temporary Directories.
Oct 08 18:58:15 compute-0 systemd[91070]: Reached target Paths.
Oct 08 18:58:15 compute-0 systemd[91070]: Reached target Timers.
Oct 08 18:58:15 compute-0 systemd[91070]: Starting D-Bus User Message Bus Socket...
Oct 08 18:58:15 compute-0 systemd[91070]: Starting Create User's Volatile Files and Directories...
Oct 08 18:58:16 compute-0 systemd[91070]: Listening on D-Bus User Message Bus Socket.
Oct 08 18:58:16 compute-0 systemd[91070]: Reached target Sockets.
Oct 08 18:58:16 compute-0 systemd[91070]: Finished Create User's Volatile Files and Directories.
Oct 08 18:58:16 compute-0 systemd[91070]: Reached target Basic System.
Oct 08 18:58:16 compute-0 systemd[91070]: Reached target Main User Target.
Oct 08 18:58:16 compute-0 systemd[91070]: Startup finished in 136ms.
Oct 08 18:58:16 compute-0 systemd[1]: Started User Manager for UID 0.
Oct 08 18:58:16 compute-0 systemd[1]: Started Session c3 of User root.
Oct 08 18:58:16 compute-0 sudo[91051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 08 18:58:16 compute-0 iscsid[91044]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 08 18:58:16 compute-0 iscsid[91044]: INFO:__main__:Validating config file
Oct 08 18:58:16 compute-0 iscsid[91044]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 08 18:58:16 compute-0 iscsid[91044]: INFO:__main__:Writing out command to execute
Oct 08 18:58:16 compute-0 sudo[91051]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:16 compute-0 systemd[1]: session-c3.scope: Deactivated successfully.
Oct 08 18:58:16 compute-0 iscsid[91044]: ++ cat /run_command
Oct 08 18:58:16 compute-0 iscsid[91044]: + CMD='/usr/sbin/iscsid -f'
Oct 08 18:58:16 compute-0 iscsid[91044]: + ARGS=
Oct 08 18:58:16 compute-0 iscsid[91044]: + sudo kolla_copy_cacerts
Oct 08 18:58:16 compute-0 sudo[91170]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 08 18:58:16 compute-0 systemd[1]: Started Session c4 of User root.
Oct 08 18:58:16 compute-0 sudo[91170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 08 18:58:16 compute-0 sudo[91170]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:16 compute-0 systemd[1]: session-c4.scope: Deactivated successfully.
Oct 08 18:58:16 compute-0 iscsid[91044]: + [[ ! -n '' ]]
Oct 08 18:58:16 compute-0 iscsid[91044]: + . kolla_extend_start
Oct 08 18:58:16 compute-0 iscsid[91044]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Oct 08 18:58:16 compute-0 iscsid[91044]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Oct 08 18:58:16 compute-0 iscsid[91044]: + umask 0022
Oct 08 18:58:16 compute-0 iscsid[91044]: Running command: '/usr/sbin/iscsid -f'
Oct 08 18:58:16 compute-0 iscsid[91044]: + exec /usr/sbin/iscsid -f
Oct 08 18:58:16 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Oct 08 18:58:16 compute-0 python3.9[91249]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:58:17 compute-0 sudo[91410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bergozbejymircgfbjirhgicwrykhqug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949896.7332363-423-160428652312029/AnsiballZ_file.py'
Oct 08 18:58:17 compute-0 sudo[91410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:17 compute-0 podman[91373]: 2025-10-08 18:58:17.11581052 +0000 UTC m=+0.107595572 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 08 18:58:17 compute-0 python3.9[91420]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:17 compute-0 sudo[91410]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:17 compute-0 podman[91448]: 2025-10-08 18:58:17.682904072 +0000 UTC m=+0.106446919 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 08 18:58:17 compute-0 sudo[91601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmhmnbqesyejcvjpqhmvvhugkbsluhkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949897.633105-434-133488675370315/AnsiballZ_service_facts.py'
Oct 08 18:58:17 compute-0 sudo[91601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:18 compute-0 python3.9[91603]: ansible-ansible.builtin.service_facts Invoked
Oct 08 18:58:18 compute-0 network[91620]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 08 18:58:18 compute-0 network[91621]: 'network-scripts' will be removed from distribution in near future.
Oct 08 18:58:18 compute-0 network[91622]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 08 18:58:22 compute-0 sudo[91601]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:23 compute-0 sudo[91894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-napiexevlgsrzkgdivlereygqtpudrid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949902.8682766-444-51743879659780/AnsiballZ_file.py'
Oct 08 18:58:23 compute-0 sudo[91894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:23 compute-0 python3.9[91896]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 08 18:58:23 compute-0 sudo[91894]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:24 compute-0 sudo[92046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plqcidmiuvlqwkejesbmxudrwourslwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949903.5913568-452-27174804929939/AnsiballZ_modprobe.py'
Oct 08 18:58:24 compute-0 sudo[92046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:24 compute-0 python3.9[92048]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Oct 08 18:58:24 compute-0 sudo[92046]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:24 compute-0 sudo[92202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrstklozdgbzelquusvhfchxjsdfphnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949904.4636195-460-218523749388223/AnsiballZ_stat.py'
Oct 08 18:58:24 compute-0 sudo[92202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:25 compute-0 python3.9[92204]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:58:25 compute-0 sudo[92202]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:25 compute-0 sudo[92325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keaaclygjauehwsfjhyxaynecddtthgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949904.4636195-460-218523749388223/AnsiballZ_copy.py'
Oct 08 18:58:25 compute-0 sudo[92325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:25 compute-0 python3.9[92327]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949904.4636195-460-218523749388223/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:25 compute-0 sudo[92325]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:26 compute-0 systemd[1]: Stopping User Manager for UID 0...
Oct 08 18:58:26 compute-0 systemd[91070]: Activating special unit Exit the Session...
Oct 08 18:58:26 compute-0 systemd[91070]: Stopped target Main User Target.
Oct 08 18:58:26 compute-0 systemd[91070]: Stopped target Basic System.
Oct 08 18:58:26 compute-0 systemd[91070]: Stopped target Paths.
Oct 08 18:58:26 compute-0 systemd[91070]: Stopped target Sockets.
Oct 08 18:58:26 compute-0 systemd[91070]: Stopped target Timers.
Oct 08 18:58:26 compute-0 systemd[91070]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 08 18:58:26 compute-0 systemd[91070]: Closed D-Bus User Message Bus Socket.
Oct 08 18:58:26 compute-0 systemd[91070]: Stopped Create User's Volatile Files and Directories.
Oct 08 18:58:26 compute-0 systemd[91070]: Removed slice User Application Slice.
Oct 08 18:58:26 compute-0 systemd[91070]: Reached target Shutdown.
Oct 08 18:58:26 compute-0 systemd[91070]: Finished Exit the Session.
Oct 08 18:58:26 compute-0 systemd[91070]: Reached target Exit the Session.
Oct 08 18:58:26 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Oct 08 18:58:26 compute-0 systemd[1]: Stopped User Manager for UID 0.
Oct 08 18:58:26 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 08 18:58:26 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 08 18:58:26 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 08 18:58:26 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 08 18:58:26 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Oct 08 18:58:26 compute-0 sudo[92478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amuhbddcyipwpfhkzspuxkstfkgaohez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949906.0444272-476-207761424271407/AnsiballZ_lineinfile.py'
Oct 08 18:58:26 compute-0 sudo[92478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:26 compute-0 python3.9[92480]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:26 compute-0 sudo[92478]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:27 compute-0 sudo[92630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjzncxzpttmqasbfmzkqwcgvqsedvplc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949906.7773194-484-176792818943689/AnsiballZ_systemd.py'
Oct 08 18:58:27 compute-0 sudo[92630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:27 compute-0 python3.9[92632]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 08 18:58:27 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 08 18:58:27 compute-0 systemd[1]: Stopped Load Kernel Modules.
Oct 08 18:58:27 compute-0 systemd[1]: Stopping Load Kernel Modules...
Oct 08 18:58:27 compute-0 systemd[1]: Starting Load Kernel Modules...
Oct 08 18:58:27 compute-0 systemd[1]: Finished Load Kernel Modules.
Oct 08 18:58:27 compute-0 sudo[92630]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:28 compute-0 sudo[92786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnhqkxxiyfqhlbdkxuyvwjghnuoimrlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949907.7804906-492-234351746928290/AnsiballZ_file.py'
Oct 08 18:58:28 compute-0 sudo[92786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:28 compute-0 python3.9[92788]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:58:28 compute-0 sudo[92786]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:28 compute-0 sudo[92938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxxnpdsmhsxigyhuumrmantzfkfmciti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949908.6078546-501-206935312439013/AnsiballZ_stat.py'
Oct 08 18:58:28 compute-0 sudo[92938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:29 compute-0 python3.9[92940]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:58:29 compute-0 sudo[92938]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:29 compute-0 sudo[93090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-solpmwtslcxofejpswcpfcooxjmodjrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949909.2873366-510-200200433510847/AnsiballZ_stat.py'
Oct 08 18:58:29 compute-0 sudo[93090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:29 compute-0 python3.9[93092]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:58:29 compute-0 sudo[93090]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:30 compute-0 sudo[93242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxjwxveibwyvtqtbzuzhugktujwelglr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949910.13523-518-187506763324848/AnsiballZ_stat.py'
Oct 08 18:58:30 compute-0 sudo[93242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:30 compute-0 python3.9[93244]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:58:30 compute-0 sudo[93242]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:31 compute-0 sudo[93365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhyzgecvjzhejkaaxxiqnqgbkrfxxldd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949910.13523-518-187506763324848/AnsiballZ_copy.py'
Oct 08 18:58:31 compute-0 sudo[93365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:31 compute-0 python3.9[93367]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949910.13523-518-187506763324848/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:31 compute-0 sudo[93365]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:32 compute-0 sudo[93517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqwhaythqtfbtbzzlldivrdzailmvzdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949911.4772012-533-146110988294819/AnsiballZ_command.py'
Oct 08 18:58:32 compute-0 sudo[93517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:32 compute-0 python3.9[93519]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:58:32 compute-0 sudo[93517]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:32 compute-0 sudo[93670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdplvsirajnlfafysqllsovzliwvswqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949912.4844508-541-38957159936616/AnsiballZ_lineinfile.py'
Oct 08 18:58:32 compute-0 sudo[93670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:33 compute-0 python3.9[93672]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:33 compute-0 sudo[93670]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:33 compute-0 sudo[93822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnzdvajmifxctvpbcaaektaxikxcftpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949913.2533448-549-180713973311073/AnsiballZ_replace.py'
Oct 08 18:58:33 compute-0 sudo[93822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:33 compute-0 python3.9[93824]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:33 compute-0 sudo[93822]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:34 compute-0 sudo[93974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmobznkgmzxwuwqdarwxxjhpeplbflpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949914.170926-557-473471462004/AnsiballZ_replace.py'
Oct 08 18:58:34 compute-0 sudo[93974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:34 compute-0 python3.9[93976]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:34 compute-0 sudo[93974]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:35 compute-0 sudo[94126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktuhuextgwqfftibmpvjogimhglaggnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949915.0459802-566-148684144240510/AnsiballZ_lineinfile.py'
Oct 08 18:58:35 compute-0 sudo[94126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:35 compute-0 python3.9[94128]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:35 compute-0 sudo[94126]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:36 compute-0 sudo[94278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mahgevwfufxqbhpxmgcsuijiqmkvqdin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949915.765567-566-74836299553919/AnsiballZ_lineinfile.py'
Oct 08 18:58:36 compute-0 sudo[94278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:36 compute-0 python3.9[94280]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:36 compute-0 sudo[94278]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:36 compute-0 sudo[94430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duhdfaskoiwmsvracihoeswrrpwdqeqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949916.4902937-566-25209723919507/AnsiballZ_lineinfile.py'
Oct 08 18:58:36 compute-0 sudo[94430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:37 compute-0 python3.9[94432]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:37 compute-0 sudo[94430]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:37 compute-0 sudo[94582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqokjfdfprmzakokqtgpwegfjuozdtqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949917.2529397-566-70498234337173/AnsiballZ_lineinfile.py'
Oct 08 18:58:37 compute-0 sudo[94582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:37 compute-0 python3.9[94584]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:37 compute-0 sudo[94582]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:38 compute-0 sudo[94734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvqjixqsvjduodknejewxqgzbyyyuppc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949918.0535574-595-189379677550429/AnsiballZ_stat.py'
Oct 08 18:58:38 compute-0 sudo[94734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:38 compute-0 python3.9[94736]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:58:38 compute-0 sudo[94734]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:39 compute-0 sudo[94888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xumfqaqkvkxvoyciuxyzzeimknvqcdcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949918.9024327-603-98712766463647/AnsiballZ_file.py'
Oct 08 18:58:39 compute-0 sudo[94888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:39 compute-0 python3.9[94890]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:39 compute-0 sudo[94888]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:40 compute-0 sudo[95040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hczdyimaxbkxfwangiybjzihjmsmxeaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949919.7415915-612-177045091744546/AnsiballZ_file.py'
Oct 08 18:58:40 compute-0 sudo[95040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:40 compute-0 python3.9[95042]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:58:40 compute-0 sudo[95040]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:40 compute-0 sudo[95192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xetgcxgpnelfdykvpbetlcevkgqqzkgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949920.5247304-620-195706665076624/AnsiballZ_stat.py'
Oct 08 18:58:40 compute-0 sudo[95192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:41 compute-0 python3.9[95194]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:58:41 compute-0 sudo[95192]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:41 compute-0 sudo[95270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcpmrwjaitejuutsvictmgqaswmbytzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949920.5247304-620-195706665076624/AnsiballZ_file.py'
Oct 08 18:58:41 compute-0 sudo[95270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:41 compute-0 python3.9[95272]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:58:41 compute-0 sudo[95270]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:41 compute-0 sudo[95422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjnlmzwmtqqizujfdtfmxwsilbooyfbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949921.6745691-620-207211712819575/AnsiballZ_stat.py'
Oct 08 18:58:41 compute-0 sudo[95422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:42 compute-0 python3.9[95424]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:58:42 compute-0 sudo[95422]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:42 compute-0 sudo[95500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiwzkrhllkwmmlqpudxiuidxxqxqgtkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949921.6745691-620-207211712819575/AnsiballZ_file.py'
Oct 08 18:58:42 compute-0 sudo[95500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:42 compute-0 python3.9[95502]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:58:42 compute-0 sudo[95500]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:43 compute-0 sudo[95652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvnjepjxxrbohcykdkbkaclaojyujgoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949922.8314333-643-41234981235550/AnsiballZ_file.py'
Oct 08 18:58:43 compute-0 sudo[95652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:43 compute-0 python3.9[95654]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:43 compute-0 sudo[95652]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:43 compute-0 sudo[95804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oywfkwhbfxsusvrispemtejyfvtgcqdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949923.5908167-651-115149333408074/AnsiballZ_stat.py'
Oct 08 18:58:43 compute-0 sudo[95804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:44 compute-0 python3.9[95806]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:58:44 compute-0 sudo[95804]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:58:44.217 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 18:58:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:58:44.219 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 18:58:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:58:44.219 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 18:58:44 compute-0 sudo[95882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qintpijwuphktgdbsgrxtmtbvvjwqhoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949923.5908167-651-115149333408074/AnsiballZ_file.py'
Oct 08 18:58:44 compute-0 sudo[95882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:44 compute-0 python3.9[95884]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:44 compute-0 sudo[95882]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:45 compute-0 sudo[96034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwbwpriepgilnlxvsctaqbmuxdlhdzcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949924.9268513-663-68365590072911/AnsiballZ_stat.py'
Oct 08 18:58:45 compute-0 sudo[96034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:45 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Oct 08 18:58:45 compute-0 python3.9[96036]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:58:45 compute-0 sudo[96034]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:45 compute-0 sudo[96113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddptjwhnqexdsfmecqahxaszitieeioc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949924.9268513-663-68365590072911/AnsiballZ_file.py'
Oct 08 18:58:45 compute-0 sudo[96113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:46 compute-0 python3.9[96115]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:46 compute-0 sudo[96113]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:46 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 08 18:58:46 compute-0 podman[96218]: 2025-10-08 18:58:46.679144403 +0000 UTC m=+0.092328102 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 08 18:58:46 compute-0 sudo[96287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebezhqdxbbvmjzjbbjoejpgvgcsflyul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949926.3013833-675-123169558968869/AnsiballZ_systemd.py'
Oct 08 18:58:46 compute-0 sudo[96287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:47 compute-0 python3.9[96290]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:58:47 compute-0 systemd[1]: Reloading.
Oct 08 18:58:47 compute-0 systemd-rc-local-generator[96316]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:58:47 compute-0 systemd-sysv-generator[96321]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:58:47 compute-0 sudo[96287]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:47 compute-0 podman[96326]: 2025-10-08 18:58:47.486980481 +0000 UTC m=+0.083672365 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 08 18:58:47 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Oct 08 18:58:48 compute-0 sudo[96513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjbhqkkvhwlowltzzzjczzgrxblttevb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949927.6691616-683-219204286091464/AnsiballZ_stat.py'
Oct 08 18:58:48 compute-0 sudo[96513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:48 compute-0 podman[96471]: 2025-10-08 18:58:48.140693641 +0000 UTC m=+0.175220885 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller)
Oct 08 18:58:48 compute-0 python3.9[96521]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:58:48 compute-0 sudo[96513]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:48 compute-0 sudo[96602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofpphfmauhfkxurgzzgxqmubfdtcjbii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949927.6691616-683-219204286091464/AnsiballZ_file.py'
Oct 08 18:58:48 compute-0 sudo[96602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:48 compute-0 python3.9[96604]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:48 compute-0 sudo[96602]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:49 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Oct 08 18:58:49 compute-0 sudo[96755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcirxdfcrwlxifzqyfcmikjzvpysiexl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949929.0260406-695-17760893964407/AnsiballZ_stat.py'
Oct 08 18:58:49 compute-0 sudo[96755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:49 compute-0 python3.9[96757]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:58:49 compute-0 sudo[96755]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:49 compute-0 sudo[96833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnpyhpfkwiglkgktqkgodgvuompgqlxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949929.0260406-695-17760893964407/AnsiballZ_file.py'
Oct 08 18:58:49 compute-0 sudo[96833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:50 compute-0 python3.9[96835]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:50 compute-0 sudo[96833]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:50 compute-0 sudo[96985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-outqrmpjfhkaatiovbemcytfqmpvcvnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949930.407911-707-32516657312738/AnsiballZ_systemd.py'
Oct 08 18:58:50 compute-0 sudo[96985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:51 compute-0 python3.9[96987]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:58:51 compute-0 systemd[1]: Reloading.
Oct 08 18:58:51 compute-0 systemd-rc-local-generator[97016]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:58:51 compute-0 systemd-sysv-generator[97020]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:58:51 compute-0 systemd[1]: Starting Create netns directory...
Oct 08 18:58:51 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 08 18:58:51 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 08 18:58:51 compute-0 systemd[1]: Finished Create netns directory.
Oct 08 18:58:51 compute-0 sudo[96985]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:52 compute-0 sudo[97179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dubmgjrsujvwmwzvyhacoxhvfqstissp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949931.896868-717-266741842660244/AnsiballZ_file.py'
Oct 08 18:58:52 compute-0 sudo[97179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:52 compute-0 python3.9[97181]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:58:52 compute-0 sudo[97179]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:53 compute-0 sudo[97331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bolcodrpnpejtbkxwovsjbbyihyjunoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949932.73868-725-31010283865741/AnsiballZ_stat.py'
Oct 08 18:58:53 compute-0 sudo[97331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:53 compute-0 python3.9[97333]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:58:53 compute-0 sudo[97331]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:53 compute-0 sudo[97454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdxsjqapyvivyuveukcnyxexmfqcnqno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949932.73868-725-31010283865741/AnsiballZ_copy.py'
Oct 08 18:58:53 compute-0 sudo[97454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:54 compute-0 python3.9[97456]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949932.73868-725-31010283865741/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:58:54 compute-0 sudo[97454]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:54 compute-0 sudo[97606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwossqbfrocykmoywriqivmmaonankby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949934.4122548-742-228064437543885/AnsiballZ_file.py'
Oct 08 18:58:54 compute-0 sudo[97606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:55 compute-0 python3.9[97608]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 18:58:55 compute-0 sudo[97606]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:55 compute-0 sudo[97758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwbwrxmjgdpdomnripeepusjhruohixc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949935.3053358-750-226047257646264/AnsiballZ_stat.py'
Oct 08 18:58:55 compute-0 sudo[97758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:55 compute-0 python3.9[97760]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:58:55 compute-0 sudo[97758]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:56 compute-0 sudo[97881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiaaatysywaudhagomqlwibsyxbklemn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949935.3053358-750-226047257646264/AnsiballZ_copy.py'
Oct 08 18:58:56 compute-0 sudo[97881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:56 compute-0 python3.9[97883]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949935.3053358-750-226047257646264/.source.json _original_basename=.13ncsuws follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:56 compute-0 sudo[97881]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:57 compute-0 sudo[98033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfgyixfisfmhwwdyjcblholgngumabsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949936.8468635-765-188559005454140/AnsiballZ_file.py'
Oct 08 18:58:57 compute-0 sudo[98033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:57 compute-0 python3.9[98035]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:58:57 compute-0 sudo[98033]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:58 compute-0 sudo[98185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iloxluqqueaqewzuteacfiwzairjhbnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949937.6721108-773-139858782182983/AnsiballZ_stat.py'
Oct 08 18:58:58 compute-0 sudo[98185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:58 compute-0 sudo[98185]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:58 compute-0 sudo[98308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clamfipfljyrijnoeykqzofljbluizpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949937.6721108-773-139858782182983/AnsiballZ_copy.py'
Oct 08 18:58:58 compute-0 sudo[98308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:58 compute-0 sudo[98308]: pam_unix(sudo:session): session closed for user root
Oct 08 18:58:59 compute-0 sudo[98460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwcrvweswgbmxxycmhtxjiaypmlynnfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949939.1862035-790-23954325061941/AnsiballZ_container_config_data.py'
Oct 08 18:58:59 compute-0 sudo[98460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:58:59 compute-0 python3.9[98462]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Oct 08 18:58:59 compute-0 sudo[98460]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:00 compute-0 sudo[98612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whhtjckmrxtrpksuwenhszhlnaqdgpbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949940.040996-799-89293231347760/AnsiballZ_container_config_hash.py'
Oct 08 18:59:00 compute-0 sudo[98612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:00 compute-0 python3.9[98614]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 08 18:59:00 compute-0 sudo[98612]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:01 compute-0 sudo[98764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuserfyuzncfbzzlnmonuffcoixudpef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949940.8909342-808-68358250129555/AnsiballZ_podman_container_info.py'
Oct 08 18:59:01 compute-0 sudo[98764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:01 compute-0 python3.9[98766]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 08 18:59:01 compute-0 sudo[98764]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:02 compute-0 sudo[98942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chilgofjxxrbibneyiknztnornwtkphd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759949942.160289-821-6512316168546/AnsiballZ_edpm_container_manage.py'
Oct 08 18:59:02 compute-0 sudo[98942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:02 compute-0 python3[98944]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 08 18:59:02 compute-0 podman[98979]: 2025-10-08 18:59:02.930685017 +0000 UTC m=+0.054971480 container create 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 08 18:59:02 compute-0 podman[98979]: 2025-10-08 18:59:02.902881379 +0000 UTC m=+0.027167832 image pull f541ff382622bd8bc9ad206129d2a8e74c239ff4503fa3b67d3bdf6d5b50b511 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43
Oct 08 18:59:02 compute-0 python3[98944]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43
Oct 08 18:59:03 compute-0 sudo[98942]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:03 compute-0 sudo[99168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncgewfjzrpysxrsldljwnjtjopfecgtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949943.2830856-829-261099980745977/AnsiballZ_stat.py'
Oct 08 18:59:03 compute-0 sudo[99168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:03 compute-0 python3.9[99170]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:59:03 compute-0 sudo[99168]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:04 compute-0 sudo[99322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxcufdlrptqpjlmboututagnjbyikxqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949944.1134086-838-2002114976239/AnsiballZ_file.py'
Oct 08 18:59:04 compute-0 sudo[99322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:04 compute-0 python3.9[99324]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:04 compute-0 sudo[99322]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:04 compute-0 sudo[99398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pufnpteonibmbdpwkpbiozwksgjjlmby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949944.1134086-838-2002114976239/AnsiballZ_stat.py'
Oct 08 18:59:04 compute-0 sudo[99398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:05 compute-0 python3.9[99400]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:59:05 compute-0 sudo[99398]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:05 compute-0 sudo[99549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svhwwvgfjkwrtqaplzrtpomfqrbgeeva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949945.1748476-838-195206719469371/AnsiballZ_copy.py'
Oct 08 18:59:05 compute-0 sudo[99549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:05 compute-0 python3.9[99551]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759949945.1748476-838-195206719469371/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:05 compute-0 sudo[99549]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:06 compute-0 sudo[99625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiekcjwallacosiobinbnsmsngcffqqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949945.1748476-838-195206719469371/AnsiballZ_systemd.py'
Oct 08 18:59:06 compute-0 sudo[99625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:06 compute-0 python3.9[99627]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 08 18:59:06 compute-0 systemd[1]: Reloading.
Oct 08 18:59:06 compute-0 systemd-rc-local-generator[99654]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:59:06 compute-0 systemd-sysv-generator[99657]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:59:06 compute-0 sudo[99625]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:07 compute-0 sudo[99735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnfbzosaksvgaxmttjadiasrnfrzdvhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949945.1748476-838-195206719469371/AnsiballZ_systemd.py'
Oct 08 18:59:07 compute-0 sudo[99735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:07 compute-0 python3.9[99737]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:59:07 compute-0 systemd[1]: Reloading.
Oct 08 18:59:07 compute-0 systemd-sysv-generator[99765]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:59:07 compute-0 systemd-rc-local-generator[99761]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:59:07 compute-0 systemd[1]: Starting multipathd container...
Oct 08 18:59:08 compute-0 systemd[1]: Started libcrun container.
Oct 08 18:59:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/517be04000c2e5d64e270c8a61cdaeaa39cf403d978930fe8216e7358faa20ca/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 08 18:59:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/517be04000c2e5d64e270c8a61cdaeaa39cf403d978930fe8216e7358faa20ca/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 08 18:59:08 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d.
Oct 08 18:59:08 compute-0 podman[99777]: 2025-10-08 18:59:08.083252179 +0000 UTC m=+0.129413080 container init 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Oct 08 18:59:08 compute-0 multipathd[99792]: + sudo -E kolla_set_configs
Oct 08 18:59:08 compute-0 podman[99777]: 2025-10-08 18:59:08.108402192 +0000 UTC m=+0.154563063 container start 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true)
Oct 08 18:59:08 compute-0 podman[99777]: multipathd
Oct 08 18:59:08 compute-0 systemd[1]: Started multipathd container.
Oct 08 18:59:08 compute-0 sudo[99799]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 08 18:59:08 compute-0 sudo[99799]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 08 18:59:08 compute-0 sudo[99799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 08 18:59:08 compute-0 sudo[99735]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:08 compute-0 multipathd[99792]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 08 18:59:08 compute-0 multipathd[99792]: INFO:__main__:Validating config file
Oct 08 18:59:08 compute-0 multipathd[99792]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 08 18:59:08 compute-0 multipathd[99792]: INFO:__main__:Writing out command to execute
Oct 08 18:59:08 compute-0 sudo[99799]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:08 compute-0 multipathd[99792]: ++ cat /run_command
Oct 08 18:59:08 compute-0 multipathd[99792]: + CMD='/usr/sbin/multipathd -d'
Oct 08 18:59:08 compute-0 multipathd[99792]: + ARGS=
Oct 08 18:59:08 compute-0 multipathd[99792]: + sudo kolla_copy_cacerts
Oct 08 18:59:08 compute-0 podman[99798]: 2025-10-08 18:59:08.231616894 +0000 UTC m=+0.100368295 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 08 18:59:08 compute-0 systemd[1]: 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d-395fe86eb4747b73.service: Main process exited, code=exited, status=1/FAILURE
Oct 08 18:59:08 compute-0 systemd[1]: 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d-395fe86eb4747b73.service: Failed with result 'exit-code'.
Oct 08 18:59:08 compute-0 sudo[99843]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 08 18:59:08 compute-0 sudo[99843]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 08 18:59:08 compute-0 sudo[99843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 08 18:59:08 compute-0 sudo[99843]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:08 compute-0 multipathd[99792]: + [[ ! -n '' ]]
Oct 08 18:59:08 compute-0 multipathd[99792]: + . kolla_extend_start
Oct 08 18:59:08 compute-0 multipathd[99792]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct 08 18:59:08 compute-0 multipathd[99792]: Running command: '/usr/sbin/multipathd -d'
Oct 08 18:59:08 compute-0 multipathd[99792]: + umask 0022
Oct 08 18:59:08 compute-0 multipathd[99792]: + exec /usr/sbin/multipathd -d
Oct 08 18:59:08 compute-0 multipathd[99792]: 618.898435 | --------start up--------
Oct 08 18:59:08 compute-0 multipathd[99792]: 618.898457 | read /etc/multipath.conf
Oct 08 18:59:08 compute-0 multipathd[99792]: 618.907779 | path checkers start up
Oct 08 18:59:08 compute-0 python3.9[99981]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 18:59:09 compute-0 sudo[100133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzwbjhxeoexxycscappyhzzcdmplwccy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949949.1818268-874-256950218874716/AnsiballZ_command.py'
Oct 08 18:59:09 compute-0 sudo[100133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:09 compute-0 python3.9[100135]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:59:09 compute-0 sudo[100133]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:10 compute-0 sudo[100298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwozarrcckbpydanhzfhcqhyyybeyxoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949950.0289445-882-70052930917331/AnsiballZ_systemd.py'
Oct 08 18:59:10 compute-0 sudo[100298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:10 compute-0 python3.9[100300]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 08 18:59:10 compute-0 systemd[1]: Stopping multipathd container...
Oct 08 18:59:10 compute-0 multipathd[99792]: 621.415308 | exit (signal)
Oct 08 18:59:10 compute-0 multipathd[99792]: 621.415372 | --------shut down-------
Oct 08 18:59:10 compute-0 systemd[1]: libpod-62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d.scope: Deactivated successfully.
Oct 08 18:59:10 compute-0 podman[100304]: 2025-10-08 18:59:10.817356538 +0000 UTC m=+0.073714830 container died 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 08 18:59:10 compute-0 systemd[1]: 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d-395fe86eb4747b73.timer: Deactivated successfully.
Oct 08 18:59:10 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d.
Oct 08 18:59:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d-userdata-shm.mount: Deactivated successfully.
Oct 08 18:59:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-517be04000c2e5d64e270c8a61cdaeaa39cf403d978930fe8216e7358faa20ca-merged.mount: Deactivated successfully.
Oct 08 18:59:10 compute-0 podman[100304]: 2025-10-08 18:59:10.864953538 +0000 UTC m=+0.121311830 container cleanup 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Oct 08 18:59:10 compute-0 podman[100304]: multipathd
Oct 08 18:59:10 compute-0 podman[100335]: multipathd
Oct 08 18:59:10 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Oct 08 18:59:10 compute-0 systemd[1]: Stopped multipathd container.
Oct 08 18:59:10 compute-0 systemd[1]: Starting multipathd container...
Oct 08 18:59:11 compute-0 systemd[1]: Started libcrun container.
Oct 08 18:59:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/517be04000c2e5d64e270c8a61cdaeaa39cf403d978930fe8216e7358faa20ca/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 08 18:59:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/517be04000c2e5d64e270c8a61cdaeaa39cf403d978930fe8216e7358faa20ca/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 08 18:59:11 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d.
Oct 08 18:59:11 compute-0 podman[100348]: 2025-10-08 18:59:11.08019783 +0000 UTC m=+0.139724682 container init 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 08 18:59:11 compute-0 multipathd[100364]: + sudo -E kolla_set_configs
Oct 08 18:59:11 compute-0 podman[100348]: 2025-10-08 18:59:11.100987849 +0000 UTC m=+0.160514381 container start 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd)
Oct 08 18:59:11 compute-0 podman[100348]: multipathd
Oct 08 18:59:11 compute-0 sudo[100370]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 08 18:59:11 compute-0 sudo[100370]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 08 18:59:11 compute-0 systemd[1]: Started multipathd container.
Oct 08 18:59:11 compute-0 sudo[100370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 08 18:59:11 compute-0 sudo[100298]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:11 compute-0 multipathd[100364]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 08 18:59:11 compute-0 multipathd[100364]: INFO:__main__:Validating config file
Oct 08 18:59:11 compute-0 multipathd[100364]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 08 18:59:11 compute-0 multipathd[100364]: INFO:__main__:Writing out command to execute
Oct 08 18:59:11 compute-0 sudo[100370]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:11 compute-0 multipathd[100364]: ++ cat /run_command
Oct 08 18:59:11 compute-0 multipathd[100364]: + CMD='/usr/sbin/multipathd -d'
Oct 08 18:59:11 compute-0 multipathd[100364]: + ARGS=
Oct 08 18:59:11 compute-0 multipathd[100364]: + sudo kolla_copy_cacerts
Oct 08 18:59:11 compute-0 podman[100371]: 2025-10-08 18:59:11.173932597 +0000 UTC m=+0.063465890 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd)
Oct 08 18:59:11 compute-0 sudo[100391]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 08 18:59:11 compute-0 sudo[100391]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 08 18:59:11 compute-0 sudo[100391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 08 18:59:11 compute-0 systemd[1]: 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d-6de5b76987abf0f5.service: Main process exited, code=exited, status=1/FAILURE
Oct 08 18:59:11 compute-0 systemd[1]: 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d-6de5b76987abf0f5.service: Failed with result 'exit-code'.
Oct 08 18:59:11 compute-0 sudo[100391]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:11 compute-0 multipathd[100364]: + [[ ! -n '' ]]
Oct 08 18:59:11 compute-0 multipathd[100364]: + . kolla_extend_start
Oct 08 18:59:11 compute-0 multipathd[100364]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct 08 18:59:11 compute-0 multipathd[100364]: Running command: '/usr/sbin/multipathd -d'
Oct 08 18:59:11 compute-0 multipathd[100364]: + umask 0022
Oct 08 18:59:11 compute-0 multipathd[100364]: + exec /usr/sbin/multipathd -d
Oct 08 18:59:11 compute-0 multipathd[100364]: 621.827410 | --------start up--------
Oct 08 18:59:11 compute-0 multipathd[100364]: 621.827429 | read /etc/multipath.conf
Oct 08 18:59:11 compute-0 multipathd[100364]: 621.832177 | path checkers start up
Oct 08 18:59:11 compute-0 sudo[100550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhugmoqymzwabtymfkjvkonrmpswpqld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949951.339539-890-207918793796862/AnsiballZ_file.py'
Oct 08 18:59:11 compute-0 sudo[100550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:11 compute-0 python3.9[100552]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:11 compute-0 sudo[100550]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:12 compute-0 sudo[100702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwkzrpuwjddltshyiccrwfaipnowvuiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949952.2242858-902-120958970896621/AnsiballZ_file.py'
Oct 08 18:59:12 compute-0 sudo[100702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:12 compute-0 python3.9[100704]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 08 18:59:12 compute-0 sudo[100702]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:13 compute-0 sudo[100854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxgjigmvnvmphkvlncmvcwzhspkzmcbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949952.9545603-910-70185411669151/AnsiballZ_modprobe.py'
Oct 08 18:59:13 compute-0 sudo[100854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:13 compute-0 python3.9[100856]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Oct 08 18:59:13 compute-0 kernel: Key type psk registered
Oct 08 18:59:13 compute-0 sudo[100854]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:14 compute-0 sudo[101017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeydvoodadvqgvjualrybkpnxvsnsnkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949953.808738-918-15789161818488/AnsiballZ_stat.py'
Oct 08 18:59:14 compute-0 sudo[101017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:14 compute-0 python3.9[101019]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 18:59:14 compute-0 sudo[101017]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:14 compute-0 sudo[101140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcencnlcckmrtzvnlujpltqjhdekdcdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949953.808738-918-15789161818488/AnsiballZ_copy.py'
Oct 08 18:59:14 compute-0 sudo[101140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:15 compute-0 python3.9[101142]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949953.808738-918-15789161818488/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:15 compute-0 sudo[101140]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:15 compute-0 sudo[101292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpicedbxmhlwjvkjhhlbhpnslhrrqsld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949955.3401742-934-144046362435125/AnsiballZ_lineinfile.py'
Oct 08 18:59:15 compute-0 sudo[101292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:15 compute-0 python3.9[101294]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:15 compute-0 sudo[101292]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:16 compute-0 sudo[101444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qczlgqpyamjsnecolugxeusjynnvnslv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949956.120439-942-263599002983803/AnsiballZ_systemd.py'
Oct 08 18:59:16 compute-0 sudo[101444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:16 compute-0 python3.9[101446]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 08 18:59:16 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 08 18:59:16 compute-0 systemd[1]: Stopped Load Kernel Modules.
Oct 08 18:59:16 compute-0 systemd[1]: Stopping Load Kernel Modules...
Oct 08 18:59:16 compute-0 systemd[1]: Starting Load Kernel Modules...
Oct 08 18:59:16 compute-0 systemd[1]: Finished Load Kernel Modules.
Oct 08 18:59:16 compute-0 sudo[101444]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:16 compute-0 podman[101448]: 2025-10-08 18:59:16.929166222 +0000 UTC m=+0.098271087 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS)
Oct 08 18:59:17 compute-0 sudo[101618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcroynjztlwyglgpnltrucdokceqgwsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949957.132118-950-226501514937692/AnsiballZ_setup.py'
Oct 08 18:59:17 compute-0 sudo[101618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:17 compute-0 podman[101620]: 2025-10-08 18:59:17.637956175 +0000 UTC m=+0.077767795 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 08 18:59:17 compute-0 python3.9[101621]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 08 18:59:18 compute-0 sudo[101618]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:18 compute-0 sudo[101736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbnldgnaaourjbucfyhqbnfmovbufeoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949957.132118-950-226501514937692/AnsiballZ_dnf.py'
Oct 08 18:59:18 compute-0 sudo[101736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:18 compute-0 podman[101688]: 2025-10-08 18:59:18.69705536 +0000 UTC m=+0.107542069 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 18:59:18 compute-0 python3.9[101743]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 08 18:59:25 compute-0 systemd[1]: Reloading.
Oct 08 18:59:25 compute-0 systemd-rc-local-generator[101775]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:59:25 compute-0 systemd-sysv-generator[101780]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:59:25 compute-0 systemd[1]: Reloading.
Oct 08 18:59:25 compute-0 systemd-rc-local-generator[101811]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:59:25 compute-0 systemd-sysv-generator[101818]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:59:25 compute-0 systemd-logind[844]: Watching system buttons on /dev/input/event0 (Power Button)
Oct 08 18:59:25 compute-0 systemd-logind[844]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct 08 18:59:26 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 08 18:59:26 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 08 18:59:26 compute-0 systemd[1]: Reloading.
Oct 08 18:59:26 compute-0 systemd-rc-local-generator[101909]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:59:26 compute-0 systemd-sysv-generator[101912]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:59:26 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 08 18:59:27 compute-0 sudo[101736]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:27 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 08 18:59:27 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 08 18:59:27 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.569s CPU time.
Oct 08 18:59:27 compute-0 systemd[1]: run-rba63fb3783dd4aea876b6703da0c6061.service: Deactivated successfully.
Oct 08 18:59:27 compute-0 sudo[103195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxhyzjjcweasdwwnkmkkkpfnfmhrhcph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949967.2311864-962-134840695249540/AnsiballZ_file.py'
Oct 08 18:59:27 compute-0 sudo[103195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:27 compute-0 python3.9[103198]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:27 compute-0 sudo[103195]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:28 compute-0 python3.9[103348]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 08 18:59:29 compute-0 sudo[103502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvzqsflcrbktfhpzmawavktpcazjmpfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949969.1098945-980-105691568302407/AnsiballZ_file.py'
Oct 08 18:59:29 compute-0 sudo[103502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:29 compute-0 python3.9[103504]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:29 compute-0 sudo[103502]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:30 compute-0 sudo[103654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syibnutgnadolmpizpfvjykzkmbomdke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949970.0568347-991-141644447801380/AnsiballZ_systemd_service.py'
Oct 08 18:59:30 compute-0 sudo[103654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:31 compute-0 python3.9[103656]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 08 18:59:31 compute-0 systemd[1]: Reloading.
Oct 08 18:59:31 compute-0 systemd-rc-local-generator[103685]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 18:59:31 compute-0 systemd-sysv-generator[103688]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 18:59:31 compute-0 sudo[103654]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:32 compute-0 python3.9[103842]: ansible-ansible.builtin.service_facts Invoked
Oct 08 18:59:32 compute-0 network[103859]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 08 18:59:32 compute-0 network[103860]: 'network-scripts' will be removed from distribution in near future.
Oct 08 18:59:32 compute-0 network[103861]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 08 18:59:36 compute-0 sudo[104136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-somitwpwhnzsiymmchszvakqldnutyij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949976.4618638-1010-210817238621286/AnsiballZ_systemd_service.py'
Oct 08 18:59:36 compute-0 sudo[104136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:37 compute-0 python3.9[104138]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:59:37 compute-0 sudo[104136]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:37 compute-0 sudo[104289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqmxvcilvtveanefwbqvoaghdzijydjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949977.3283675-1010-151444253192309/AnsiballZ_systemd_service.py'
Oct 08 18:59:37 compute-0 sudo[104289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:37 compute-0 python3.9[104291]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:59:39 compute-0 sudo[104289]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:39 compute-0 sudo[104442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyhhfhfxqidttaulbbwftglxbplizvgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949979.2036362-1010-19157721306396/AnsiballZ_systemd_service.py'
Oct 08 18:59:39 compute-0 sudo[104442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:39 compute-0 python3.9[104444]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:59:40 compute-0 sudo[104442]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:41 compute-0 podman[104569]: 2025-10-08 18:59:41.370781193 +0000 UTC m=+0.064293784 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible)
Oct 08 18:59:41 compute-0 sudo[104611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzqxulkhlxvaekycqojbeikbafvqasch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949981.0300624-1010-227161625580102/AnsiballZ_systemd_service.py'
Oct 08 18:59:41 compute-0 sudo[104611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:41 compute-0 python3.9[104616]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:59:41 compute-0 sudo[104611]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:42 compute-0 sudo[104767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocptsxydumwcyndyzrbhxxlltkshffak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949981.852763-1010-67810647189437/AnsiballZ_systemd_service.py'
Oct 08 18:59:42 compute-0 sudo[104767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:42 compute-0 python3.9[104769]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:59:42 compute-0 sudo[104767]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:42 compute-0 sudo[104920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvapkehggewjmwlsmseehxfudbyodpte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949982.6391072-1010-54047335915012/AnsiballZ_systemd_service.py'
Oct 08 18:59:42 compute-0 sudo[104920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:43 compute-0 python3.9[104922]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:59:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:59:44.219 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 18:59:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:59:44.219 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 18:59:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:59:44.219 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 18:59:44 compute-0 sudo[104920]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:44 compute-0 sudo[105073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgbaepwnteovckjmedtfusknujmuxpye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949984.4019454-1010-29231214104231/AnsiballZ_systemd_service.py'
Oct 08 18:59:44 compute-0 sudo[105073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:45 compute-0 python3.9[105075]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:59:45 compute-0 sudo[105073]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:45 compute-0 sudo[105226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwjvykrqbtczlrmkbrjuuwehkmfclkqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949985.2805927-1010-273637241567328/AnsiballZ_systemd_service.py'
Oct 08 18:59:45 compute-0 sudo[105226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:45 compute-0 python3.9[105228]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 18:59:45 compute-0 sudo[105226]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:46 compute-0 sudo[105379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtzaitggyxyixilqexujreurnhlmqcdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949986.2089188-1069-95207314239701/AnsiballZ_file.py'
Oct 08 18:59:46 compute-0 sudo[105379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:46 compute-0 python3.9[105381]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:46 compute-0 sudo[105379]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:47 compute-0 sudo[105544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzkvgkpnypnwtruikehsmduppmeearrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949986.8813257-1069-201899192675542/AnsiballZ_file.py'
Oct 08 18:59:47 compute-0 sudo[105544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:47 compute-0 podman[105505]: 2025-10-08 18:59:47.203473206 +0000 UTC m=+0.068956066 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 18:59:47 compute-0 python3.9[105551]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:47 compute-0 sudo[105544]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:47 compute-0 podman[105675]: 2025-10-08 18:59:47.892222541 +0000 UTC m=+0.069531302 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct 08 18:59:47 compute-0 sudo[105721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lekansxhkccgcuacaqrnoyncmzvbznqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949987.5637589-1069-126276753743276/AnsiballZ_file.py'
Oct 08 18:59:47 compute-0 sudo[105721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:48 compute-0 python3.9[105723]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:48 compute-0 sudo[105721]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:48 compute-0 sudo[105873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wytutloezmjenpyeyowsxyfmgewumnpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949988.248816-1069-18861620913581/AnsiballZ_file.py'
Oct 08 18:59:48 compute-0 sudo[105873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:48 compute-0 python3.9[105875]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:48 compute-0 sudo[105873]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:49 compute-0 sudo[106038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lffdamnkmahpxvleisllwywocfajvedw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949988.8911572-1069-267781167835210/AnsiballZ_file.py'
Oct 08 18:59:49 compute-0 sudo[106038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:49 compute-0 podman[105999]: 2025-10-08 18:59:49.352766847 +0000 UTC m=+0.150364334 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 08 18:59:49 compute-0 python3.9[106049]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:49 compute-0 sudo[106038]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:50 compute-0 sudo[106206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mesidhftkjbejmdesaefqtyzdhbyzujk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949989.6500428-1069-25982207425950/AnsiballZ_file.py'
Oct 08 18:59:50 compute-0 sudo[106206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:50 compute-0 python3.9[106208]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:50 compute-0 sudo[106206]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:50 compute-0 sudo[106358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kasmkfkvmicmzlewnuqsyoymmboyxuxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949990.4531052-1069-121473368772131/AnsiballZ_file.py'
Oct 08 18:59:50 compute-0 sudo[106358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:50 compute-0 python3.9[106360]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:50 compute-0 sudo[106358]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:51 compute-0 sudo[106510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zahhmontprwcsmbbqjwoajwjftaljtmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949991.1379063-1069-176366555804920/AnsiballZ_file.py'
Oct 08 18:59:51 compute-0 sudo[106510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:51 compute-0 python3.9[106512]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:51 compute-0 sudo[106510]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:52 compute-0 sudo[106662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhathudeieimacvmvqnorpmeijxnqssk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949991.9463043-1126-108463972086194/AnsiballZ_file.py'
Oct 08 18:59:52 compute-0 sudo[106662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:52 compute-0 python3.9[106664]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:52 compute-0 sudo[106662]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:52 compute-0 sudo[106814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqfuqlwbitxvuphlwtgmciblsrxsogam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949992.627085-1126-202330429549244/AnsiballZ_file.py'
Oct 08 18:59:52 compute-0 sudo[106814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:53 compute-0 python3.9[106816]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:53 compute-0 sudo[106814]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:53 compute-0 sudo[106966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdmiarlgrhkixwvozibwbnlqklfqsixe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949993.3455663-1126-278996216296314/AnsiballZ_file.py'
Oct 08 18:59:53 compute-0 sudo[106966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:53 compute-0 python3.9[106968]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:53 compute-0 sudo[106966]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:54 compute-0 sudo[107118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmgznyyfdzyhkcwdbxdpoawcmwvqvwwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949994.1094244-1126-124796356758198/AnsiballZ_file.py'
Oct 08 18:59:54 compute-0 sudo[107118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:54 compute-0 python3.9[107120]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:54 compute-0 sudo[107118]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:55 compute-0 sudo[107270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaizhzfchnlsnlqltxgthbetrhyjkltz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949994.8363457-1126-263787470032000/AnsiballZ_file.py'
Oct 08 18:59:55 compute-0 sudo[107270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:55 compute-0 python3.9[107272]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:55 compute-0 sudo[107270]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:55 compute-0 sudo[107422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxihmlvfmodzorulfnjxwphkztvdznwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949995.566461-1126-183939369632197/AnsiballZ_file.py'
Oct 08 18:59:55 compute-0 sudo[107422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:56 compute-0 python3.9[107424]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:56 compute-0 sudo[107422]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:56 compute-0 sudo[107574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oelsqgpfilhdeutazilfklaampdegrhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949996.264278-1126-117114442401224/AnsiballZ_file.py'
Oct 08 18:59:56 compute-0 sudo[107574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:56 compute-0 python3.9[107576]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:56 compute-0 sudo[107574]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:57 compute-0 sudo[107726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jshrppwzskjipuluqpwkpxhkxchfszxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949997.0354695-1126-218070253447713/AnsiballZ_file.py'
Oct 08 18:59:57 compute-0 sudo[107726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:57 compute-0 python3.9[107728]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 18:59:57 compute-0 sudo[107726]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:58 compute-0 sudo[107878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hudoqodrpoialhsxfdvoldmkewjrxgrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949997.8420894-1184-108092308254202/AnsiballZ_command.py'
Oct 08 18:59:58 compute-0 sudo[107878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 18:59:58 compute-0 python3.9[107880]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 18:59:58 compute-0 sudo[107878]: pam_unix(sudo:session): session closed for user root
Oct 08 18:59:59 compute-0 python3.9[108032]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 08 18:59:59 compute-0 sudo[108182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuofpnlowxybfiotcjomoubvhcksmxwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759949999.6602561-1202-92789405743873/AnsiballZ_systemd_service.py'
Oct 08 18:59:59 compute-0 sudo[108182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:00 compute-0 python3.9[108184]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 08 19:00:00 compute-0 systemd[1]: Starting system activity accounting tool...
Oct 08 19:00:00 compute-0 systemd[1]: Reloading.
Oct 08 19:00:00 compute-0 systemd-sysv-generator[108214]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 19:00:00 compute-0 systemd-rc-local-generator[108209]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 19:00:00 compute-0 systemd[1]: sysstat-collect.service: Deactivated successfully.
Oct 08 19:00:00 compute-0 systemd[1]: Finished system activity accounting tool.
Oct 08 19:00:00 compute-0 sudo[108182]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:01 compute-0 sudo[108371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghgqetlartpghpuslvropywrovfseluw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950000.8384652-1210-262627332973748/AnsiballZ_command.py'
Oct 08 19:00:01 compute-0 sudo[108371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:01 compute-0 python3.9[108373]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 19:00:01 compute-0 sudo[108371]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:01 compute-0 sudo[108524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eclkgevhdveesckncmbucodxssggfbzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950001.5493166-1210-2975510760985/AnsiballZ_command.py'
Oct 08 19:00:01 compute-0 sudo[108524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:02 compute-0 python3.9[108526]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 19:00:02 compute-0 sudo[108524]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:02 compute-0 sudo[108677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggvkqjaguyzaqpfaybkfjxiherklpqez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950002.3703308-1210-274242407200719/AnsiballZ_command.py'
Oct 08 19:00:02 compute-0 sudo[108677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:02 compute-0 python3.9[108679]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 19:00:02 compute-0 sudo[108677]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:03 compute-0 sudo[108830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oructqyttnaaxvpgeebamkknogbkzsva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950003.057483-1210-8453501181857/AnsiballZ_command.py'
Oct 08 19:00:03 compute-0 sudo[108830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:03 compute-0 python3.9[108832]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 19:00:03 compute-0 sudo[108830]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:04 compute-0 sudo[108983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhjxbabhawtrsnjbwvvuxmtvwujctiux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950003.8208354-1210-26139692948810/AnsiballZ_command.py'
Oct 08 19:00:04 compute-0 sudo[108983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:04 compute-0 python3.9[108985]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 19:00:04 compute-0 sudo[108983]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:04 compute-0 sudo[109136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjsymzsmlmyxrkynrwwbivdxvpdfswzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950004.5818038-1210-95914144255926/AnsiballZ_command.py'
Oct 08 19:00:04 compute-0 sudo[109136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:05 compute-0 python3.9[109138]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 19:00:05 compute-0 sudo[109136]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:05 compute-0 sudo[109289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxjipeeqexfmrvdevesnqfyzwyhidrwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950005.2699635-1210-25919627516231/AnsiballZ_command.py'
Oct 08 19:00:05 compute-0 sudo[109289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:05 compute-0 python3.9[109291]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 19:00:05 compute-0 sudo[109289]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:06 compute-0 sudo[109442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhhnbcglqbwmgwetbqvcpaqurlmyvlea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950006.0442054-1210-272258865533834/AnsiballZ_command.py'
Oct 08 19:00:06 compute-0 sudo[109442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:06 compute-0 python3.9[109444]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 19:00:06 compute-0 sudo[109442]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:07 compute-0 sudo[109595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhtbxrguceenovifutldkeaivizhonqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950007.5125499-1289-214925377973434/AnsiballZ_file.py'
Oct 08 19:00:07 compute-0 sudo[109595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:08 compute-0 python3.9[109597]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:08 compute-0 sudo[109595]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:08 compute-0 sudo[109747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcicfbzdhsyrviplopevabedfkkdjggg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950008.3529384-1289-15745342790821/AnsiballZ_file.py'
Oct 08 19:00:08 compute-0 sudo[109747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:08 compute-0 python3.9[109749]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:08 compute-0 sudo[109747]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:09 compute-0 sudo[109899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvlvdpyvpvvkfwsmetnhofeqmossnnhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950009.0853834-1289-156390093081733/AnsiballZ_file.py'
Oct 08 19:00:09 compute-0 sudo[109899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:09 compute-0 python3.9[109901]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:09 compute-0 sudo[109899]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:10 compute-0 sudo[110051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhnlovtfpftohakzovxacudwrckcwwst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950010.0409665-1311-109420607385858/AnsiballZ_file.py'
Oct 08 19:00:10 compute-0 sudo[110051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:10 compute-0 python3.9[110053]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:10 compute-0 sudo[110051]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:11 compute-0 sudo[110203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygbrekrvcdazidpzseslzlukitrofpgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950010.7934-1311-142679336922110/AnsiballZ_file.py'
Oct 08 19:00:11 compute-0 sudo[110203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:11 compute-0 python3.9[110205]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:11 compute-0 sudo[110203]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:11 compute-0 podman[110253]: 2025-10-08 19:00:11.706285812 +0000 UTC m=+0.110991354 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 08 19:00:11 compute-0 sudo[110372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpsqvfgerqhypwscccrynwbdyfxdaddu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950011.5392094-1311-53533371497462/AnsiballZ_file.py'
Oct 08 19:00:11 compute-0 sudo[110372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:12 compute-0 python3.9[110374]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:12 compute-0 sudo[110372]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:12 compute-0 sudo[110524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwcikiopijjubzopbqeatbvjfhlvmomd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950012.348732-1311-165579465614549/AnsiballZ_file.py'
Oct 08 19:00:12 compute-0 sudo[110524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:12 compute-0 python3.9[110526]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:12 compute-0 sudo[110524]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:13 compute-0 sudo[110676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqpratoechqlavywztmxyanesownjsek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950013.1345527-1311-268039654828405/AnsiballZ_file.py'
Oct 08 19:00:13 compute-0 sudo[110676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:13 compute-0 python3.9[110678]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:13 compute-0 sudo[110676]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:14 compute-0 sudo[110828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bserrjgzhhccjxwjkmgjkqvgqdyeqwky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950013.913936-1311-11362151224419/AnsiballZ_file.py'
Oct 08 19:00:14 compute-0 sudo[110828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:14 compute-0 python3.9[110830]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:14 compute-0 sudo[110828]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:15 compute-0 sudo[110981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nljbfufqzigdaevvaxlycfhpdydqsvpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950014.6511803-1311-115963774253/AnsiballZ_file.py'
Oct 08 19:00:15 compute-0 sudo[110981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:15 compute-0 python3.9[110983]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:15 compute-0 sudo[110981]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:15 compute-0 sudo[111133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlfhhqmobvlqzvwgfcmmlwtvqpvxnwrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950015.4780986-1311-64675816703592/AnsiballZ_file.py'
Oct 08 19:00:15 compute-0 sudo[111133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:16 compute-0 python3.9[111135]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:16 compute-0 sudo[111133]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:16 compute-0 sudo[111285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uypiyhwdwbaiosyqloeoifycpafdefdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950016.2397301-1311-251718534144424/AnsiballZ_file.py'
Oct 08 19:00:16 compute-0 sudo[111285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:16 compute-0 python3.9[111287]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:16 compute-0 sudo[111285]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:17 compute-0 podman[111312]: 2025-10-08 19:00:17.666307067 +0000 UTC m=+0.082398001 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid)
Oct 08 19:00:18 compute-0 podman[111332]: 2025-10-08 19:00:18.65623814 +0000 UTC m=+0.074176834 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 08 19:00:19 compute-0 podman[111352]: 2025-10-08 19:00:19.706468348 +0000 UTC m=+0.134181661 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct 08 19:00:21 compute-0 sudo[111504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szhbkyhfgmoqpavejncvhrsfdjxorlor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950021.0177248-1494-28832404352645/AnsiballZ_getent.py'
Oct 08 19:00:21 compute-0 sudo[111504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:21 compute-0 python3.9[111506]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Oct 08 19:00:21 compute-0 sudo[111504]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:22 compute-0 sudo[111657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpdsuhujhzblujisksagctiotckfabjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950021.9121063-1502-31605405373894/AnsiballZ_group.py'
Oct 08 19:00:22 compute-0 sudo[111657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:22 compute-0 python3.9[111659]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 08 19:00:22 compute-0 groupadd[111660]: group added to /etc/group: name=nova, GID=42436
Oct 08 19:00:22 compute-0 groupadd[111660]: group added to /etc/gshadow: name=nova
Oct 08 19:00:22 compute-0 groupadd[111660]: new group: name=nova, GID=42436
Oct 08 19:00:22 compute-0 sudo[111657]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:23 compute-0 sudo[111815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfszqfuchttpikakauevdrcnezrbacbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950022.9300783-1510-27444972331663/AnsiballZ_user.py'
Oct 08 19:00:23 compute-0 sudo[111815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:23 compute-0 python3.9[111817]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 08 19:00:23 compute-0 useradd[111819]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Oct 08 19:00:23 compute-0 useradd[111819]: add 'nova' to group 'libvirt'
Oct 08 19:00:23 compute-0 useradd[111819]: add 'nova' to shadow group 'libvirt'
Oct 08 19:00:23 compute-0 sudo[111815]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:24 compute-0 sshd-session[111850]: Accepted publickey for zuul from 192.168.122.30 port 59910 ssh2: ECDSA SHA256:i+73Mx2Y/ukt1b+huf+9w+ftZalnyybbDU6glTR0JfU
Oct 08 19:00:24 compute-0 systemd-logind[844]: New session 10 of user zuul.
Oct 08 19:00:24 compute-0 systemd[1]: Started Session 10 of User zuul.
Oct 08 19:00:24 compute-0 sshd-session[111850]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 08 19:00:24 compute-0 sshd-session[111853]: Received disconnect from 192.168.122.30 port 59910:11: disconnected by user
Oct 08 19:00:24 compute-0 sshd-session[111853]: Disconnected from user zuul 192.168.122.30 port 59910
Oct 08 19:00:24 compute-0 sshd-session[111850]: pam_unix(sshd:session): session closed for user zuul
Oct 08 19:00:24 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Oct 08 19:00:24 compute-0 systemd-logind[844]: Session 10 logged out. Waiting for processes to exit.
Oct 08 19:00:24 compute-0 systemd-logind[844]: Removed session 10.
Oct 08 19:00:25 compute-0 python3.9[112003]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:00:26 compute-0 python3.9[112124]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950025.0368984-1535-261634882125252/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:27 compute-0 python3.9[112274]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:00:27 compute-0 python3.9[112350]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:28 compute-0 python3.9[112500]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:00:28 compute-0 python3.9[112621]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950027.6656327-1535-207035824011674/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:29 compute-0 python3.9[112771]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:00:30 compute-0 python3.9[112894]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950028.9868026-1535-15403618273586/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:30 compute-0 unix_chkpwd[112895]: password check failed for user (root)
Oct 08 19:00:30 compute-0 sshd-session[112772]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=78.128.112.74  user=root
Oct 08 19:00:30 compute-0 python3.9[113045]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:00:31 compute-0 python3.9[113166]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950030.289836-1535-192588183002005/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:32 compute-0 sudo[113316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyxngpreqlowkkxectpczkzyvmplnoto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950031.6571858-1604-48567953947243/AnsiballZ_file.py'
Oct 08 19:00:32 compute-0 sudo[113316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:32 compute-0 python3.9[113318]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:00:32 compute-0 sudo[113316]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:32 compute-0 sshd-session[112772]: Failed password for root from 78.128.112.74 port 46162 ssh2
Oct 08 19:00:32 compute-0 sshd-session[112772]: Connection closed by authenticating user root 78.128.112.74 port 46162 [preauth]
Oct 08 19:00:32 compute-0 sudo[113468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdqljgwtqzyqssefxbsczyilrzlfptlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950032.4249263-1612-215678574199020/AnsiballZ_copy.py'
Oct 08 19:00:32 compute-0 sudo[113468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:32 compute-0 python3.9[113470]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:00:33 compute-0 sudo[113468]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:33 compute-0 sudo[113620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwmamavrtqaofljjimtxknukoryqlfdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950033.1870978-1620-85449727038816/AnsiballZ_stat.py'
Oct 08 19:00:33 compute-0 sudo[113620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:33 compute-0 python3.9[113622]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 19:00:33 compute-0 sudo[113620]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:34 compute-0 sudo[113772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kymbzzxomxzjjqckwevnpgikkujixlcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950033.9499967-1628-253820750042259/AnsiballZ_stat.py'
Oct 08 19:00:34 compute-0 sudo[113772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:34 compute-0 python3.9[113774]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:00:34 compute-0 sudo[113772]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:34 compute-0 sudo[113895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnnuvxwsqahdkyccxihftcsdacluvrqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950033.9499967-1628-253820750042259/AnsiballZ_copy.py'
Oct 08 19:00:34 compute-0 sudo[113895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:35 compute-0 python3.9[113897]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1759950033.9499967-1628-253820750042259/.source _original_basename=.0k0akyhg follow=False checksum=08a0b89788bacf538e34b1b4a6eecea1c595c768 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Oct 08 19:00:35 compute-0 sudo[113895]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:35 compute-0 python3.9[114049]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 19:00:36 compute-0 python3.9[114201]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:00:37 compute-0 python3.9[114322]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950036.2191405-1654-23521037203750/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=837ffd9c004e5987a2e117698c56827ebbfeb5b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:38 compute-0 python3.9[114472]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:00:38 compute-0 python3.9[114593]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950037.6038756-1669-204846307884455/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=722ab36345f3375cbdcf911ce8f6e1a8083d7e59 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:00:39 compute-0 sudo[114743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keunfopzxalelkalchxanwsfelbidtdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950039.308519-1686-272037462786160/AnsiballZ_container_config_data.py'
Oct 08 19:00:39 compute-0 sudo[114743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:39 compute-0 python3.9[114745]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Oct 08 19:00:39 compute-0 sudo[114743]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:40 compute-0 sudo[114895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqynotdpvjpnhvdvhabnhbjbtbbxsyul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950040.100076-1695-132612027799428/AnsiballZ_container_config_hash.py'
Oct 08 19:00:40 compute-0 sudo[114895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:40 compute-0 python3.9[114897]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 08 19:00:40 compute-0 sudo[114895]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:41 compute-0 sudo[115047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypuikjqzmgfgekelkjvpqywfxbjxsgtf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759950040.9249089-1705-24964256833686/AnsiballZ_edpm_container_manage.py'
Oct 08 19:00:41 compute-0 sudo[115047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:41 compute-0 python3[115049]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Oct 08 19:00:41 compute-0 podman[115086]: 2025-10-08 19:00:41.720000524 +0000 UTC m=+0.071898660 container create a5a596648e1d0ab8a84f48da9bc406d3b2c789976c4340d1386b8a7ace66d133 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, container_name=nova_compute_init, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm)
Oct 08 19:00:41 compute-0 podman[115086]: 2025-10-08 19:00:41.678222122 +0000 UTC m=+0.030120308 image pull 7ac362f4e836cf46e10a309acb4abf774df9481a1d6404c213437495cfb42f5d quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844
Oct 08 19:00:41 compute-0 python3[115049]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844 bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Oct 08 19:00:41 compute-0 sudo[115047]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:42 compute-0 sudo[115292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbovgdgnaowswdcahbjicrsgdwalvofz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950042.08688-1713-176890393933935/AnsiballZ_stat.py'
Oct 08 19:00:42 compute-0 sudo[115292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:42 compute-0 podman[115248]: 2025-10-08 19:00:42.474281179 +0000 UTC m=+0.088208858 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 08 19:00:42 compute-0 python3.9[115297]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 19:00:42 compute-0 sudo[115292]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:43 compute-0 sudo[115449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxyqdjlerhaevbrlaiylbbhdynhfqiik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950043.1117525-1725-62313959461801/AnsiballZ_container_config_data.py'
Oct 08 19:00:43 compute-0 sudo[115449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:43 compute-0 python3.9[115451]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Oct 08 19:00:43 compute-0 sudo[115449]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:44 compute-0 sudo[115601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcsjirnarqxzbedfbukdzxhgyjypksgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950043.9454927-1734-37571893955336/AnsiballZ_container_config_hash.py'
Oct 08 19:00:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:00:44.220 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:00:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:00:44.221 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:00:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:00:44.221 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:00:44 compute-0 sudo[115601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:44 compute-0 python3.9[115603]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 08 19:00:44 compute-0 sudo[115601]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:45 compute-0 sudo[115753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxcmzxfakgjdrdnlzawpkynbqjypkwbv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759950044.7423167-1744-51846336415223/AnsiballZ_edpm_container_manage.py'
Oct 08 19:00:45 compute-0 sudo[115753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:45 compute-0 python3[115755]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct 08 19:00:45 compute-0 podman[115793]: 2025-10-08 19:00:45.571118381 +0000 UTC m=+0.062236861 container create e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, config_id=edpm, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, container_name=nova_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:00:45 compute-0 podman[115793]: 2025-10-08 19:00:45.532721687 +0000 UTC m=+0.023840187 image pull 7ac362f4e836cf46e10a309acb4abf774df9481a1d6404c213437495cfb42f5d quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844
Oct 08 19:00:45 compute-0 python3[115755]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844 kolla_start
Oct 08 19:00:45 compute-0 sudo[115753]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:46 compute-0 sudo[115981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldpeqvvvnwizwmnfhcngshhgqnkiraiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950045.9658399-1752-172612365588306/AnsiballZ_stat.py'
Oct 08 19:00:46 compute-0 sudo[115981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:46 compute-0 python3.9[115983]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 19:00:46 compute-0 sudo[115981]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:47 compute-0 sudo[116137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crpjajulkjvctyqiiattmtgzwjulgffz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950046.7996306-1761-29154166674567/AnsiballZ_file.py'
Oct 08 19:00:47 compute-0 sudo[116137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:47 compute-0 python3.9[116139]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:00:47 compute-0 sudo[116137]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:47 compute-0 unix_chkpwd[116163]: password check failed for user (root)
Oct 08 19:00:47 compute-0 sshd-session[116055]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159  user=root
Oct 08 19:00:48 compute-0 sudo[116300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrchypmfiaipzulmgsxkpohjhlxsuciv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950047.6049562-1761-157117261782955/AnsiballZ_copy.py'
Oct 08 19:00:48 compute-0 sudo[116300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:48 compute-0 podman[116263]: 2025-10-08 19:00:48.108741261 +0000 UTC m=+0.070666154 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true)
Oct 08 19:00:48 compute-0 python3.9[116309]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759950047.6049562-1761-157117261782955/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:00:48 compute-0 sudo[116300]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:48 compute-0 sudo[116383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmrisadiuiguwnltiqgzlmzptzjbxwaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950047.6049562-1761-157117261782955/AnsiballZ_systemd.py'
Oct 08 19:00:48 compute-0 sudo[116383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:48 compute-0 python3.9[116385]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 08 19:00:48 compute-0 systemd[1]: Reloading.
Oct 08 19:00:49 compute-0 systemd-sysv-generator[116425]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 19:00:49 compute-0 systemd-rc-local-generator[116421]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 19:00:49 compute-0 podman[116387]: 2025-10-08 19:00:49.025850118 +0000 UTC m=+0.094642803 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 08 19:00:49 compute-0 sudo[116383]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:49 compute-0 sudo[116513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfwraxruqpwrzqhhenxuyxfzaozibpta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950047.6049562-1761-157117261782955/AnsiballZ_systemd.py'
Oct 08 19:00:49 compute-0 sudo[116513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:49 compute-0 python3.9[116515]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 19:00:49 compute-0 systemd[1]: Reloading.
Oct 08 19:00:49 compute-0 sshd-session[116055]: Failed password for root from 193.46.255.159 port 63470 ssh2
Oct 08 19:00:49 compute-0 podman[116517]: 2025-10-08 19:00:49.96216491 +0000 UTC m=+0.104921609 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:00:50 compute-0 systemd-rc-local-generator[116574]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 19:00:50 compute-0 systemd-sysv-generator[116579]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 19:00:50 compute-0 unix_chkpwd[116581]: password check failed for user (root)
Oct 08 19:00:50 compute-0 systemd[1]: Starting nova_compute container...
Oct 08 19:00:50 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:00:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct 08 19:00:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 08 19:00:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 08 19:00:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 08 19:00:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 08 19:00:50 compute-0 podman[116584]: 2025-10-08 19:00:50.408354283 +0000 UTC m=+0.175391806 container init e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible)
Oct 08 19:00:50 compute-0 podman[116584]: 2025-10-08 19:00:50.414709856 +0000 UTC m=+0.181747359 container start e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Oct 08 19:00:50 compute-0 podman[116584]: nova_compute
Oct 08 19:00:50 compute-0 systemd[1]: Started nova_compute container.
Oct 08 19:00:50 compute-0 nova_compute[116600]: + sudo -E kolla_set_configs
Oct 08 19:00:50 compute-0 sudo[116513]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Validating config file
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Copying service configuration files
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Deleting /etc/ceph
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Creating directory /etc/ceph
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /etc/ceph
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Writing out command to execute
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 08 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 08 19:00:50 compute-0 nova_compute[116600]: ++ cat /run_command
Oct 08 19:00:50 compute-0 nova_compute[116600]: + CMD=nova-compute
Oct 08 19:00:50 compute-0 nova_compute[116600]: + ARGS=
Oct 08 19:00:50 compute-0 nova_compute[116600]: + sudo kolla_copy_cacerts
Oct 08 19:00:50 compute-0 nova_compute[116600]: + [[ ! -n '' ]]
Oct 08 19:00:50 compute-0 nova_compute[116600]: + . kolla_extend_start
Oct 08 19:00:50 compute-0 nova_compute[116600]: Running command: 'nova-compute'
Oct 08 19:00:50 compute-0 nova_compute[116600]: + echo 'Running command: '\''nova-compute'\'''
Oct 08 19:00:50 compute-0 nova_compute[116600]: + umask 0022
Oct 08 19:00:50 compute-0 nova_compute[116600]: + exec nova-compute
Oct 08 19:00:51 compute-0 python3.9[116761]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 19:00:51 compute-0 sshd-session[116055]: Failed password for root from 193.46.255.159 port 63470 ssh2
Oct 08 19:00:52 compute-0 python3.9[116912]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 19:00:52 compute-0 unix_chkpwd[117001]: password check failed for user (root)
Oct 08 19:00:52 compute-0 python3.9[117063]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 19:00:53 compute-0 nova_compute[116600]: 2025-10-08 19:00:53.369 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 08 19:00:53 compute-0 nova_compute[116600]: 2025-10-08 19:00:53.369 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 08 19:00:53 compute-0 nova_compute[116600]: 2025-10-08 19:00:53.369 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 08 19:00:53 compute-0 nova_compute[116600]: 2025-10-08 19:00:53.369 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Oct 08 19:00:53 compute-0 sudo[117215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgpewfjffvyamwrjgffrrwsiqneumyck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950053.1528811-1821-106445396891359/AnsiballZ_podman_container.py'
Oct 08 19:00:53 compute-0 sudo[117215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:53 compute-0 nova_compute[116600]: 2025-10-08 19:00:53.580 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:00:53 compute-0 nova_compute[116600]: 2025-10-08 19:00:53.617 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:00:53 compute-0 python3.9[117217]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 08 19:00:53 compute-0 sudo[117215]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.187 2 INFO nova.virt.driver [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Oct 08 19:00:54 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 08 19:00:54 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 08 19:00:54 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 08 19:00:54 compute-0 sshd-session[116055]: Failed password for root from 193.46.255.159 port 63470 ssh2
Oct 08 19:00:54 compute-0 sudo[117390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbdhctdgpqbaxsaebfkxhiemygbmuast ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950054.0176356-1829-271949354727571/AnsiballZ_systemd.py'
Oct 08 19:00:54 compute-0 sudo[117390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.498 2 INFO nova.compute.provider_config [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.513 2 DEBUG oslo_concurrency.lockutils [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.514 2 DEBUG oslo_concurrency.lockutils [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.514 2 DEBUG oslo_concurrency.lockutils [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.514 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.515 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.515 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.515 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.515 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.515 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.515 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.516 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.516 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.516 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.516 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.516 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.517 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.517 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.517 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.517 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.517 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.518 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.518 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.518 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.518 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.518 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.519 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.519 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.519 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.519 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.519 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.520 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.520 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.520 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.520 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.520 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.521 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.521 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.521 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.521 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.521 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.522 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.522 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.522 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.522 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.522 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.523 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.523 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.523 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.523 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.523 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.524 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.524 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.524 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.524 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.524 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.525 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.525 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.525 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.525 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.525 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.526 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.526 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.526 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.526 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.526 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.527 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.527 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.527 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.527 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.527 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.527 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.528 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.528 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.528 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.528 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.528 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.529 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.529 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.529 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.529 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.529 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.530 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.530 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.530 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.530 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.530 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.531 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.531 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.531 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.531 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.531 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.532 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.532 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.532 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.532 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.532 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.533 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.533 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.533 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.533 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.533 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.533 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.534 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.534 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.534 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.534 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.534 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.535 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.535 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.535 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.535 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.535 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.535 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.536 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.536 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.536 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.536 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.536 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.536 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.537 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.537 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.537 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.537 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.537 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.538 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.538 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.538 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.538 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.538 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.539 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.539 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.539 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.539 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.539 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.539 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.540 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.540 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.540 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.540 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.540 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.541 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.541 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.541 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.541 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.541 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.542 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.542 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.542 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.542 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.542 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.543 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.543 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.543 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.543 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.543 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.544 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.544 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.544 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.544 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.544 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.545 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.545 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.545 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.545 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.545 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.546 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.546 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.546 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.546 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.546 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.547 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.547 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.547 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.547 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.547 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.547 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.548 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.548 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.548 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.548 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.549 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.549 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.549 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.549 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.549 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.550 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.550 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.550 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.550 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.550 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.550 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.551 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.551 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.551 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.551 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.551 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.552 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.552 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.552 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.552 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.552 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.553 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.553 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.553 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.553 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.553 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.554 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.554 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.554 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.554 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.554 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.555 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.555 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.555 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.555 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.555 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.556 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.556 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.556 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.556 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.556 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.557 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.557 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.557 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.557 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.557 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.557 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.558 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.558 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.558 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.558 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.558 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.559 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.559 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.559 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.559 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.559 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.560 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.560 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.560 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.560 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.560 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.561 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.561 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.561 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.561 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.561 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.562 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.562 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.562 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.562 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.562 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.562 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.563 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.563 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.563 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.563 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.563 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.564 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.564 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.564 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.564 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.564 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.565 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.565 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.565 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.565 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.565 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.566 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.566 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.566 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.566 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.566 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.567 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.567 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.567 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.567 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.567 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.568 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.568 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.568 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.568 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.568 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.568 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.569 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.569 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.569 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.569 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.569 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.570 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.570 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.570 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.570 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.570 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.571 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.571 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.571 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.571 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.571 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.572 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.572 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.572 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.572 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.572 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.573 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.573 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.573 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.573 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.573 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.573 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.574 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.574 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.574 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.574 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.574 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.575 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.575 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.575 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.575 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.575 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.576 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.576 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.576 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.576 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.576 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.577 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.577 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.577 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.577 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.577 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.577 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.578 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.578 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.578 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.578 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.578 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.579 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.579 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.579 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.579 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.579 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.580 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.580 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.580 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.580 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.580 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.581 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.581 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.581 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.581 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.581 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.582 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.582 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.582 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.582 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.582 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.582 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.583 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.583 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.583 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.583 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.584 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.584 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.584 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.584 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.584 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.585 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.585 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.585 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.585 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.585 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.586 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.586 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.586 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.586 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.586 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.586 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.587 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.587 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.587 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.587 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.587 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.588 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.588 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.588 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.588 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.588 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.589 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.589 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.589 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.589 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.589 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.590 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.590 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.590 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.590 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.590 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.591 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.591 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.591 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.591 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.591 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.591 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.592 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.592 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.592 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.592 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.592 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.593 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.593 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.593 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.593 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.593 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.594 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.594 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.594 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.594 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.594 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.595 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.595 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.595 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.595 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.595 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.595 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.596 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.596 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.596 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.596 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.596 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.597 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.597 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.597 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.597 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.597 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.598 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.598 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.598 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.598 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.598 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.599 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.599 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.599 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.599 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.599 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.599 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.600 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.600 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.600 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.600 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.600 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.601 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.601 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.601 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.601 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.601 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.602 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.602 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.602 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.602 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.602 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.603 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.603 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.603 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.603 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.603 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.604 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.604 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.604 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.604 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.604 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.605 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.605 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.605 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.605 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.606 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.606 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.606 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.606 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.606 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.607 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.607 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.607 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.607 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.607 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.608 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.608 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.608 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.608 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.608 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.609 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.609 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.609 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.609 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.609 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.610 2 WARNING oslo_config.cfg [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct 08 19:00:54 compute-0 nova_compute[116600]: live_migration_uri is deprecated for removal in favor of two other options that
Oct 08 19:00:54 compute-0 nova_compute[116600]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct 08 19:00:54 compute-0 nova_compute[116600]: and ``live_migration_inbound_addr`` respectively.
Oct 08 19:00:54 compute-0 nova_compute[116600]: ).  Its value may be silently ignored in the future.
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.610 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.610 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.610 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.611 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.611 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.611 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.611 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.612 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.612 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.612 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.612 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.612 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.612 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.613 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.613 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.613 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.613 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.614 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.614 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.614 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.614 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.614 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.615 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.615 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.615 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.615 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.615 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.616 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.616 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.616 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.616 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.616 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.617 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.617 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.617 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.617 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.617 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.618 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.618 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.618 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.618 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.618 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.619 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.619 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.619 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.619 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.619 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.620 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.620 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.620 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.620 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.620 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.621 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.621 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.621 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.621 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.621 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.622 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.622 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.622 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.622 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.622 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.623 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.623 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.623 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.623 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.623 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.624 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.624 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.624 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.624 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.624 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.625 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.625 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.625 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.625 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.626 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.626 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.626 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.626 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.626 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.627 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.627 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.627 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.627 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.627 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.628 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.628 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.628 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.628 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.628 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.629 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.629 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.629 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.629 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.629 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.630 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.630 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.630 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.630 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.630 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.631 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.631 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.631 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.631 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.631 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.632 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.632 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.632 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.632 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.632 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.633 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.633 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.633 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.633 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.633 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.634 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.634 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.634 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.634 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.634 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.635 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.635 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.635 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.635 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.636 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.636 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.636 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.636 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.636 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.636 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.637 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.637 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.637 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.637 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.638 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.638 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.638 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.638 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.639 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.639 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.639 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.639 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.639 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.640 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.640 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.640 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.640 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.640 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.641 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.641 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.641 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.641 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.641 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.642 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.642 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.642 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.642 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.642 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.643 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.643 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.643 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.643 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.643 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.644 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.644 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.644 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.644 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.644 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.645 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.645 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.645 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.645 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.645 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.646 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.646 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.646 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.646 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.647 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.647 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.647 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.647 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.647 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.648 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.648 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.648 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.648 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.648 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.649 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.649 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.649 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.649 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.649 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.650 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.650 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.650 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.650 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.651 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.651 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.651 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.651 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.651 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.652 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.652 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.652 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.652 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.652 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.653 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.653 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.653 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.653 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.653 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.654 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.654 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.654 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.654 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.654 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.655 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.655 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.655 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.655 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.655 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.656 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.656 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.656 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.656 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.656 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.657 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.657 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.657 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.657 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.657 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.658 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.658 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.658 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.658 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.658 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.659 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.659 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.659 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.659 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.659 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.660 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.660 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.660 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.660 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.660 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.661 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.661 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.661 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.661 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.662 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.662 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.662 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.662 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.662 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.663 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.663 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.663 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.663 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.663 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.664 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.664 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.664 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.664 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.664 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.665 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.665 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.665 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.665 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.665 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.666 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.666 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.666 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.666 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.666 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.667 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.667 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.667 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.667 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.667 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.668 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.668 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.668 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.668 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.668 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.668 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.669 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.669 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.669 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.669 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.670 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.670 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.670 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.670 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.670 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.671 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.671 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.671 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.671 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.671 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.672 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.672 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.672 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.672 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.672 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.673 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.673 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.673 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.673 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.673 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.674 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.674 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.674 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.674 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.674 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.675 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.675 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.675 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.675 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.675 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.676 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.676 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.676 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.676 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.676 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.677 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.677 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.677 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.677 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.677 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.678 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.678 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.678 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.678 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.678 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.679 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.679 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.679 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.679 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.679 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.680 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.680 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.680 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.680 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.680 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.681 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.681 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.681 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.681 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.681 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.681 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.682 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.682 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.682 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.682 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.682 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.683 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.683 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.683 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.683 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.683 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.684 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.684 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.684 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.684 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.684 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.685 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.685 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.685 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.685 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.685 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.686 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.686 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.686 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.686 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.686 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.687 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.687 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.687 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.687 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.687 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.688 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.688 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.688 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.688 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.688 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.688 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.689 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.689 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.689 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.689 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.689 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.690 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.690 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.690 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.690 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.690 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.691 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.691 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.691 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.691 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.691 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.692 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.692 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.692 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.692 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.692 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.693 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.693 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.693 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.693 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.693 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.694 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.694 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.694 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.694 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.694 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.695 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.695 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.695 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.695 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.695 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.695 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 08 19:00:54 compute-0 python3.9[117392]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.701 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.714 2 DEBUG nova.virt.libvirt.host [None req-8d283617-3907-4613-a076-7af437b7681f - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.714 2 DEBUG nova.virt.libvirt.host [None req-8d283617-3907-4613-a076-7af437b7681f - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.715 2 DEBUG nova.virt.libvirt.host [None req-8d283617-3907-4613-a076-7af437b7681f - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.715 2 DEBUG nova.virt.libvirt.host [None req-8d283617-3907-4613-a076-7af437b7681f - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Oct 08 19:00:54 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Oct 08 19:00:54 compute-0 systemd[1]: Stopping nova_compute container...
Oct 08 19:00:54 compute-0 systemd[1]: Started libvirt QEMU daemon.
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.779 2 DEBUG nova.virt.libvirt.host [None req-8d283617-3907-4613-a076-7af437b7681f - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fee62472be0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.781 2 DEBUG nova.virt.libvirt.host [None req-8d283617-3907-4613-a076-7af437b7681f - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fee62472be0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.781 2 INFO nova.virt.libvirt.driver [None req-8d283617-3907-4613-a076-7af437b7681f - - - - - -] Connection event '1' reason 'None'
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.807 2 WARNING nova.virt.libvirt.driver [None req-8d283617-3907-4613-a076-7af437b7681f - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.807 2 DEBUG nova.virt.libvirt.volume.mount [None req-8d283617-3907-4613-a076-7af437b7681f - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.811 2 DEBUG oslo_concurrency.lockutils [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.811 2 DEBUG oslo_concurrency.lockutils [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.811 2 DEBUG oslo_concurrency.lockutils [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:00:55 compute-0 sshd-session[116055]: Received disconnect from 193.46.255.159 port 63470:11:  [preauth]
Oct 08 19:00:55 compute-0 sshd-session[116055]: Disconnected from authenticating user root 193.46.255.159 port 63470 [preauth]
Oct 08 19:00:55 compute-0 sshd-session[116055]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159  user=root
Oct 08 19:00:55 compute-0 unix_chkpwd[117472]: password check failed for user (root)
Oct 08 19:00:55 compute-0 sshd-session[117461]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159  user=root
Oct 08 19:00:56 compute-0 virtqemud[117415]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Oct 08 19:00:56 compute-0 virtqemud[117415]: hostname: compute-0
Oct 08 19:00:56 compute-0 virtqemud[117415]: End of file while reading data: Input/output error
Oct 08 19:00:56 compute-0 systemd[1]: libpod-e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf.scope: Deactivated successfully.
Oct 08 19:00:56 compute-0 systemd[1]: libpod-e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf.scope: Consumed 3.306s CPU time.
Oct 08 19:00:56 compute-0 podman[117418]: 2025-10-08 19:00:56.086599994 +0000 UTC m=+1.331394185 container died e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Oct 08 19:00:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf-userdata-shm.mount: Deactivated successfully.
Oct 08 19:00:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d-merged.mount: Deactivated successfully.
Oct 08 19:00:56 compute-0 podman[117418]: 2025-10-08 19:00:56.257189551 +0000 UTC m=+1.501983742 container cleanup e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=edpm)
Oct 08 19:00:56 compute-0 podman[117418]: nova_compute
Oct 08 19:00:56 compute-0 podman[117486]: nova_compute
Oct 08 19:00:56 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Oct 08 19:00:56 compute-0 systemd[1]: Stopped nova_compute container.
Oct 08 19:00:56 compute-0 systemd[1]: Starting nova_compute container...
Oct 08 19:00:56 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:00:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct 08 19:00:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 08 19:00:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 08 19:00:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 08 19:00:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 08 19:00:56 compute-0 podman[117499]: 2025-10-08 19:00:56.455018661 +0000 UTC m=+0.100517212 container init e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 08 19:00:56 compute-0 podman[117499]: 2025-10-08 19:00:56.462271209 +0000 UTC m=+0.107769680 container start e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, config_id=edpm, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 08 19:00:56 compute-0 podman[117499]: nova_compute
Oct 08 19:00:56 compute-0 nova_compute[117514]: + sudo -E kolla_set_configs
Oct 08 19:00:56 compute-0 systemd[1]: Started nova_compute container.
Oct 08 19:00:56 compute-0 sudo[117390]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Validating config file
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Copying service configuration files
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Deleting /etc/ceph
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Creating directory /etc/ceph
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /etc/ceph
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Writing out command to execute
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 08 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 08 19:00:56 compute-0 nova_compute[117514]: ++ cat /run_command
Oct 08 19:00:56 compute-0 nova_compute[117514]: + CMD=nova-compute
Oct 08 19:00:56 compute-0 nova_compute[117514]: + ARGS=
Oct 08 19:00:56 compute-0 nova_compute[117514]: + sudo kolla_copy_cacerts
Oct 08 19:00:56 compute-0 nova_compute[117514]: + [[ ! -n '' ]]
Oct 08 19:00:56 compute-0 nova_compute[117514]: + . kolla_extend_start
Oct 08 19:00:56 compute-0 nova_compute[117514]: + echo 'Running command: '\''nova-compute'\'''
Oct 08 19:00:56 compute-0 nova_compute[117514]: Running command: 'nova-compute'
Oct 08 19:00:56 compute-0 nova_compute[117514]: + umask 0022
Oct 08 19:00:56 compute-0 nova_compute[117514]: + exec nova-compute
Oct 08 19:00:57 compute-0 sudo[117675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjxpasgbapgmnytaegkvmdecepkxyygn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950056.8226206-1838-117162020078414/AnsiballZ_podman_container.py'
Oct 08 19:00:57 compute-0 sudo[117675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:00:57 compute-0 sshd-session[117461]: Failed password for root from 193.46.255.159 port 54464 ssh2
Oct 08 19:00:57 compute-0 python3.9[117677]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 08 19:00:57 compute-0 systemd[1]: Started libpod-conmon-a5a596648e1d0ab8a84f48da9bc406d3b2c789976c4340d1386b8a7ace66d133.scope.
Oct 08 19:00:57 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:00:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/299922c2d269b58288dcf595ac237e859afe8756ba652180b1124ae293d6c96c/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Oct 08 19:00:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/299922c2d269b58288dcf595ac237e859afe8756ba652180b1124ae293d6c96c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 08 19:00:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/299922c2d269b58288dcf595ac237e859afe8756ba652180b1124ae293d6c96c/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Oct 08 19:00:57 compute-0 podman[117703]: 2025-10-08 19:00:57.644434761 +0000 UTC m=+0.119530709 container init a5a596648e1d0ab8a84f48da9bc406d3b2c789976c4340d1386b8a7ace66d133 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.build-date=20251001)
Oct 08 19:00:57 compute-0 podman[117703]: 2025-10-08 19:00:57.652089321 +0000 UTC m=+0.127185269 container start a5a596648e1d0ab8a84f48da9bc406d3b2c789976c4340d1386b8a7ace66d133 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 08 19:00:57 compute-0 python3.9[117677]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Oct 08 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Applying nova statedir ownership
Oct 08 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Oct 08 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Oct 08 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Oct 08 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Oct 08 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Oct 08 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Oct 08 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Oct 08 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Oct 08 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Oct 08 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Oct 08 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Oct 08 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Oct 08 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Nova statedir ownership complete
Oct 08 19:00:57 compute-0 systemd[1]: libpod-a5a596648e1d0ab8a84f48da9bc406d3b2c789976c4340d1386b8a7ace66d133.scope: Deactivated successfully.
Oct 08 19:00:57 compute-0 podman[117726]: 2025-10-08 19:00:57.725130942 +0000 UTC m=+0.042983507 container died a5a596648e1d0ab8a84f48da9bc406d3b2c789976c4340d1386b8a7ace66d133 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=nova_compute_init, org.label-schema.build-date=20251001)
Oct 08 19:00:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a5a596648e1d0ab8a84f48da9bc406d3b2c789976c4340d1386b8a7ace66d133-userdata-shm.mount: Deactivated successfully.
Oct 08 19:00:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-299922c2d269b58288dcf595ac237e859afe8756ba652180b1124ae293d6c96c-merged.mount: Deactivated successfully.
Oct 08 19:00:57 compute-0 podman[117739]: 2025-10-08 19:00:57.807257475 +0000 UTC m=+0.072261170 container cleanup a5a596648e1d0ab8a84f48da9bc406d3b2c789976c4340d1386b8a7ace66d133 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Oct 08 19:00:57 compute-0 systemd[1]: libpod-conmon-a5a596648e1d0ab8a84f48da9bc406d3b2c789976c4340d1386b8a7ace66d133.scope: Deactivated successfully.
Oct 08 19:00:57 compute-0 sudo[117675]: pam_unix(sudo:session): session closed for user root
Oct 08 19:00:58 compute-0 sshd-session[83112]: Connection closed by 192.168.122.30 port 37192
Oct 08 19:00:58 compute-0 sshd-session[83109]: pam_unix(sshd:session): session closed for user zuul
Oct 08 19:00:58 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Oct 08 19:00:58 compute-0 systemd[1]: session-8.scope: Consumed 2min 39.539s CPU time.
Oct 08 19:00:58 compute-0 systemd-logind[844]: Session 8 logged out. Waiting for processes to exit.
Oct 08 19:00:58 compute-0 systemd-logind[844]: Removed session 8.
Oct 08 19:00:58 compute-0 unix_chkpwd[117796]: password check failed for user (root)
Oct 08 19:00:58 compute-0 nova_compute[117514]: 2025-10-08 19:00:58.553 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 08 19:00:58 compute-0 nova_compute[117514]: 2025-10-08 19:00:58.554 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 08 19:00:58 compute-0 nova_compute[117514]: 2025-10-08 19:00:58.554 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 08 19:00:58 compute-0 nova_compute[117514]: 2025-10-08 19:00:58.554 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Oct 08 19:00:58 compute-0 nova_compute[117514]: 2025-10-08 19:00:58.678 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:00:58 compute-0 nova_compute[117514]: 2025-10-08 19:00:58.707 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.230 2 INFO nova.virt.driver [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.373 2 INFO nova.compute.provider_config [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.397 2 DEBUG oslo_concurrency.lockutils [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.397 2 DEBUG oslo_concurrency.lockutils [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.397 2 DEBUG oslo_concurrency.lockutils [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.398 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.398 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.398 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.398 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.399 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.399 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.399 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.399 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.399 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.399 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.400 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.400 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.400 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.400 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.401 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.401 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.401 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.401 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.402 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.402 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.402 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.402 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.403 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.403 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.403 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.403 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.404 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.404 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.404 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.404 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.405 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.405 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.405 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.405 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.406 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.406 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.406 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.407 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.407 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.407 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.407 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.408 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.408 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.408 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.408 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.408 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.409 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.409 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.409 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.409 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.409 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.410 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.410 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.410 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.410 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.410 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.410 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.411 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.411 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.411 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.411 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.411 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.411 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.412 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.412 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.412 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.412 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.412 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.412 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.413 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.413 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.413 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.413 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.413 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.414 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.414 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.414 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.414 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.414 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.414 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.415 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.415 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.415 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.415 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.416 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.416 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.416 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.416 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.416 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.417 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.417 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.417 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.417 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.417 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.418 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.418 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.418 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.418 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.419 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.419 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.419 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.419 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.419 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.420 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.420 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.420 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.420 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.420 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.421 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.421 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.421 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.421 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.421 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.421 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.422 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.422 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.422 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.422 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.422 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.423 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.423 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.423 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.423 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.423 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.424 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.424 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.424 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.424 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.424 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.425 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.425 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.425 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.425 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.425 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.425 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.426 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.426 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.426 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.426 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.426 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.426 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.427 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.427 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.427 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.427 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.427 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.427 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.428 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.428 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.428 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.428 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.428 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.428 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.429 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.429 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.429 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.429 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.429 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.429 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.430 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.430 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.430 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.430 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.430 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.430 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.431 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.431 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.431 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.431 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.431 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.432 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.432 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.432 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.432 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.432 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.433 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.433 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.433 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.433 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.434 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.434 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.434 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.434 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.434 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.435 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.435 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.435 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.435 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.435 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.436 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.436 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.436 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.436 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.436 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.437 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.437 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.437 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.437 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.437 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.437 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.438 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.438 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.438 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.438 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.438 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.438 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.438 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.439 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.439 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.439 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.439 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.439 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.439 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.439 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.440 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.440 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.440 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.440 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.440 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.441 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.441 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.441 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.441 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.441 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.441 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.441 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.441 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.442 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.442 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.442 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.442 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.442 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.442 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.442 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.443 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.443 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.443 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.443 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.443 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.443 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.443 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.444 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.444 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.444 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.444 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.444 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.444 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.444 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.445 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.445 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.445 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.445 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.445 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.445 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.445 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.445 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.446 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.446 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.446 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.446 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.446 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.446 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.446 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.447 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.447 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.447 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.447 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.447 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.447 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.447 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.448 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.448 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.448 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.448 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.448 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.448 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.448 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.449 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.449 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.449 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.449 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.449 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.449 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.449 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.449 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.450 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.450 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.450 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.450 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.450 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.450 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.450 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.451 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.451 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.451 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.451 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.451 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.451 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.451 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.452 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.452 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.452 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.452 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.452 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.452 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.452 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.453 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.453 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.453 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.453 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.453 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.453 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.453 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.454 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.454 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.454 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.454 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.454 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.454 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.455 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.455 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.455 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.455 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.455 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.456 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.456 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.456 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.456 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.456 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.456 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.456 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.457 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.457 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.457 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.457 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.457 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.457 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.457 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.458 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.458 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.458 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.458 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.458 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.458 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.458 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.459 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.459 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.459 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.459 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.459 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.459 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.459 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.460 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.460 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.460 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.460 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.460 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.460 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.461 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.461 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.461 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.461 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.461 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.461 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.461 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.462 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.462 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.462 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.462 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.462 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.462 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.462 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.463 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.463 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.463 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.463 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.463 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.463 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.463 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.463 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.464 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.464 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.464 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.464 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.464 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.464 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.464 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.465 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.465 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.465 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.465 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.465 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.465 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.465 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.466 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.466 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.466 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.466 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.466 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.466 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.466 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.467 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.467 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.467 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.467 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.467 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.467 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.467 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.467 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.468 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.468 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.468 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.468 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.468 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.468 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.468 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.469 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.469 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.469 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.469 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.469 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.469 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.469 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.469 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.470 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.470 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.470 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.470 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.470 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.470 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.470 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.471 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.471 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.471 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.471 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.471 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.471 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.471 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.471 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.472 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.472 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.472 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.472 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.472 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.472 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.472 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.473 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.473 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.473 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.473 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.473 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.473 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.473 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.473 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.474 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.474 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.474 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.474 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.474 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.474 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.474 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.475 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.475 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.475 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.475 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.475 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.475 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.475 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.476 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.476 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.476 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.476 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.476 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.476 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.476 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.476 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.477 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.477 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.477 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.477 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.477 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.477 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.477 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.477 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.478 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.478 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.478 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.478 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.478 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.478 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.478 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.479 2 WARNING oslo_config.cfg [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct 08 19:00:59 compute-0 nova_compute[117514]: live_migration_uri is deprecated for removal in favor of two other options that
Oct 08 19:00:59 compute-0 nova_compute[117514]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct 08 19:00:59 compute-0 nova_compute[117514]: and ``live_migration_inbound_addr`` respectively.
Oct 08 19:00:59 compute-0 nova_compute[117514]: ).  Its value may be silently ignored in the future.
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.479 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.479 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.479 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.479 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.479 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.480 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.480 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.480 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.480 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.480 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.480 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.480 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.481 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.481 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.481 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.481 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.481 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.481 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.481 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.482 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.482 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.482 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.482 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.482 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.482 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.482 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.483 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.483 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.483 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.483 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.483 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.483 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.484 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.484 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.484 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.484 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.484 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.484 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.484 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.485 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.485 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.485 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.485 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.485 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.485 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.485 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.486 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.486 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.486 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.486 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.486 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.486 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.486 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.487 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.487 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.487 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.487 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.487 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.487 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.487 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.488 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.488 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.488 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.488 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.488 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.488 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.488 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.489 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.489 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.489 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.489 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.489 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.489 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.489 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.490 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.490 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.490 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.490 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.490 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.490 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.490 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.490 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.491 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.491 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.491 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.491 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.491 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.491 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.492 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.492 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.492 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.492 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.492 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.492 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.492 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.493 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.493 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.493 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.493 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.493 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.494 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.494 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.494 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.494 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.494 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.494 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.495 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.495 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.495 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.495 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.495 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.495 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.495 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.496 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.496 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.496 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.496 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.496 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.496 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.496 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.497 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.497 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.497 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.497 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.497 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.497 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.497 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.498 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.498 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.498 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.498 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.498 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.498 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.498 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.499 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.499 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.499 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.499 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.499 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.499 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.500 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.500 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.500 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.500 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.500 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.500 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.500 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.501 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.501 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.501 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.501 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.501 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.501 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.501 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.501 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.502 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.502 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.502 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.502 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.502 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.502 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.502 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.503 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.503 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.503 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.503 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.503 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.504 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.504 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.504 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.504 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.504 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.504 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.504 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.505 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.505 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.505 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.505 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.505 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.505 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.505 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.506 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.506 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.506 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.506 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.506 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.506 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.506 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.507 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.507 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.507 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.507 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.507 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.507 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.507 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.508 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.508 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.508 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.508 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.508 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.508 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.509 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.509 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.509 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.509 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.509 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.509 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.510 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.510 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.510 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.510 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.510 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.510 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.510 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.511 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.511 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.511 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.511 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.511 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.511 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.511 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.512 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.512 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.512 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.512 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.512 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.512 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.512 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.512 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.513 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.513 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.513 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.513 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.513 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.513 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.513 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.514 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.514 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.514 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.514 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.514 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.514 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.514 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.515 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.515 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.515 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.515 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.515 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.515 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.516 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.516 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.516 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.516 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.516 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.516 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.516 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.517 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.517 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.517 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.517 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.517 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.517 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.517 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.517 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.518 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.518 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.518 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.518 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.518 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.518 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.519 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.519 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.519 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.519 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.519 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.519 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.519 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.520 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.520 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.520 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.520 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.520 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.520 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.520 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.521 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.521 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.521 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.521 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.521 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.521 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.522 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.522 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.522 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.522 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.522 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.522 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.523 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.523 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.523 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.523 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.523 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.523 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.524 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.524 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.524 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.524 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.524 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.524 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.524 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.525 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.525 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.525 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.525 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.525 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.525 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.525 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.526 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.526 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.526 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.526 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.526 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.526 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.526 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.526 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.527 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.527 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.527 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.527 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.527 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.527 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.527 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.528 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.528 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.528 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.528 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.528 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.528 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.528 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.529 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.529 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.529 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.529 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.529 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.529 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.529 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.530 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.530 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.530 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.530 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.530 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.530 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.530 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.531 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.531 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.531 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.531 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.531 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.531 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.531 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.532 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.532 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.532 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.532 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.532 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.532 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.532 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.532 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.533 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.533 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.533 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.533 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.533 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.533 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.533 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.534 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.534 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.534 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.534 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.534 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.534 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.534 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.534 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.535 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.535 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.535 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.535 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.535 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.535 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.535 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.536 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.536 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.536 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.536 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.536 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.536 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.536 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.537 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.537 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.537 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.537 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.537 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.537 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.537 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.538 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.538 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.538 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.538 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.538 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.538 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.538 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.539 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.539 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.539 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.539 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.539 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.539 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.539 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.539 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.540 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.540 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.540 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.540 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.540 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.540 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.541 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.554 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.555 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.555 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.555 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.568 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fe40d5c63d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.570 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fe40d5c63d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.571 2 INFO nova.virt.libvirt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Connection event '1' reason 'None'
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.579 2 INFO nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Libvirt host capabilities <capabilities>
Oct 08 19:00:59 compute-0 nova_compute[117514]: 
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <host>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <uuid>9ff32318-d7e0-4b37-bb6e-ea4cfd795672</uuid>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <cpu>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <arch>x86_64</arch>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model>EPYC-Rome-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <vendor>AMD</vendor>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <microcode version='16777317'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <signature family='23' model='49' stepping='0'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <maxphysaddr mode='emulate' bits='40'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='x2apic'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='tsc-deadline'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='osxsave'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='hypervisor'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='tsc_adjust'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='spec-ctrl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='stibp'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='arch-capabilities'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='ssbd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='cmp_legacy'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='topoext'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='virt-ssbd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='lbrv'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='tsc-scale'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='vmcb-clean'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='pause-filter'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='pfthreshold'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='svme-addr-chk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='rdctl-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='skip-l1dfl-vmentry'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='mds-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature name='pschange-mc-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <pages unit='KiB' size='4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <pages unit='KiB' size='2048'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <pages unit='KiB' size='1048576'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </cpu>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <power_management>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <suspend_mem/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <suspend_disk/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <suspend_hybrid/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </power_management>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <iommu support='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <migration_features>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <live/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <uri_transports>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <uri_transport>tcp</uri_transport>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <uri_transport>rdma</uri_transport>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </uri_transports>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </migration_features>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <topology>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <cells num='1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <cell id='0'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:           <memory unit='KiB'>7864104</memory>
Oct 08 19:00:59 compute-0 nova_compute[117514]:           <pages unit='KiB' size='4'>1966026</pages>
Oct 08 19:00:59 compute-0 nova_compute[117514]:           <pages unit='KiB' size='2048'>0</pages>
Oct 08 19:00:59 compute-0 nova_compute[117514]:           <pages unit='KiB' size='1048576'>0</pages>
Oct 08 19:00:59 compute-0 nova_compute[117514]:           <distances>
Oct 08 19:00:59 compute-0 nova_compute[117514]:             <sibling id='0' value='10'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:           </distances>
Oct 08 19:00:59 compute-0 nova_compute[117514]:           <cpus num='8'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:           </cpus>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         </cell>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </cells>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </topology>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <cache>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </cache>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <secmodel>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model>selinux</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <doi>0</doi>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </secmodel>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <secmodel>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model>dac</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <doi>0</doi>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <baselabel type='kvm'>+107:+107</baselabel>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <baselabel type='qemu'>+107:+107</baselabel>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </secmodel>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </host>
Oct 08 19:00:59 compute-0 nova_compute[117514]: 
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <guest>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <os_type>hvm</os_type>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <arch name='i686'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <wordsize>32</wordsize>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <domain type='qemu'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <domain type='kvm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </arch>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <features>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <pae/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <nonpae/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <acpi default='on' toggle='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <apic default='on' toggle='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <cpuselection/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <deviceboot/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <disksnapshot default='on' toggle='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <externalSnapshot/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </features>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </guest>
Oct 08 19:00:59 compute-0 nova_compute[117514]: 
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <guest>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <os_type>hvm</os_type>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <arch name='x86_64'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <wordsize>64</wordsize>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <domain type='qemu'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <domain type='kvm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </arch>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <features>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <acpi default='on' toggle='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <apic default='on' toggle='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <cpuselection/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <deviceboot/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <disksnapshot default='on' toggle='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <externalSnapshot/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </features>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </guest>
Oct 08 19:00:59 compute-0 nova_compute[117514]: 
Oct 08 19:00:59 compute-0 nova_compute[117514]: </capabilities>
Oct 08 19:00:59 compute-0 nova_compute[117514]: 
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.584 2 WARNING nova.virt.libvirt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.584 2 DEBUG nova.virt.libvirt.volume.mount [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.586 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.610 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct 08 19:00:59 compute-0 nova_compute[117514]: <domainCapabilities>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <path>/usr/libexec/qemu-kvm</path>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <domain>kvm</domain>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <arch>i686</arch>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <vcpu max='4096'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <iothreads supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <os supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <enum name='firmware'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <loader supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='type'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>rom</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>pflash</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='readonly'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>yes</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>no</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='secure'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>no</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </loader>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </os>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <cpu>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <mode name='host-passthrough' supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='hostPassthroughMigratable'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>on</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>off</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </mode>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <mode name='maximum' supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='maximumMigratable'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>on</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>off</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </mode>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <mode name='host-model' supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <vendor>AMD</vendor>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='x2apic'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='tsc-deadline'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='hypervisor'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='tsc_adjust'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='spec-ctrl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='stibp'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='arch-capabilities'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='ssbd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='cmp_legacy'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='overflow-recov'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='succor'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='ibrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='amd-ssbd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='virt-ssbd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='lbrv'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='tsc-scale'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='vmcb-clean'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='flushbyasid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='pause-filter'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='pfthreshold'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='svme-addr-chk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='rdctl-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='mds-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='pschange-mc-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='gds-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='rfds-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='disable' name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </mode>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <mode name='custom' supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-noTSX'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v5'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cooperlake'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cooperlake-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cooperlake-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Denverton'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mpx'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Denverton-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mpx'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Denverton-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Denverton-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Dhyana-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Genoa'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amd-psfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='auto-ibrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='no-nested-data-bp'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='null-sel-clr-base'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='stibp-always-on'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Genoa-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amd-psfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='auto-ibrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='no-nested-data-bp'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='null-sel-clr-base'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='stibp-always-on'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Milan'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Milan-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Milan-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amd-psfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='no-nested-data-bp'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='null-sel-clr-base'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='stibp-always-on'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Rome'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Rome-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Rome-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Rome-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='GraniteRapids'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='prefetchiti'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='GraniteRapids-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='prefetchiti'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='GraniteRapids-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx10'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx10-128'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx10-256'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx10-512'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='prefetchiti'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-noTSX'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-noTSX'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v5'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v6'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v7'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='IvyBridge'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='IvyBridge-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='IvyBridge-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='IvyBridge-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='KnightsMill'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-4fmaps'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-4vnniw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512er'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512pf'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='KnightsMill-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-4fmaps'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-4vnniw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512er'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512pf'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Opteron_G4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fma4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xop'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Opteron_G4-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fma4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xop'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Opteron_G5'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fma4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tbm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xop'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Opteron_G5-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fma4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tbm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xop'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SapphireRapids'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SapphireRapids-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SapphireRapids-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SapphireRapids-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SierraForest'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-ne-convert'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cmpccxadd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SierraForest-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-ne-convert'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cmpccxadd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v5'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='core-capability'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mpx'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='split-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='core-capability'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mpx'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='split-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='core-capability'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='split-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='core-capability'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='split-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='athlon'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnow'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnowext'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='athlon-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnow'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnowext'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='core2duo'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='core2duo-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='coreduo'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='coreduo-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='n270'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='n270-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='phenom'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnow'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnowext'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='phenom-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnow'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnowext'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </mode>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <memoryBacking supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <enum name='sourceType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>file</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>anonymous</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>memfd</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </memoryBacking>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <disk supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='diskDevice'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>disk</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>cdrom</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>floppy</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>lun</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='bus'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>fdc</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>scsi</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>usb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>sata</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio-transitional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio-non-transitional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <graphics supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='type'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vnc</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>egl-headless</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>dbus</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </graphics>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <video supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='modelType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vga</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>cirrus</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>none</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>bochs</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>ramfb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </video>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <hostdev supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='mode'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>subsystem</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='startupPolicy'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>default</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>mandatory</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>requisite</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>optional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='subsysType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>usb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>pci</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>scsi</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='capsType'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='pciBackend'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </hostdev>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <rng supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio-transitional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio-non-transitional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendModel'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>random</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>egd</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>builtin</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <filesystem supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='driverType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>path</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>handle</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtiofs</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </filesystem>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <tpm supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>tpm-tis</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>tpm-crb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendModel'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>emulator</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>external</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendVersion'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>2.0</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </tpm>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <redirdev supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='bus'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>usb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </redirdev>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <channel supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='type'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>pty</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>unix</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </channel>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <crypto supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='type'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>qemu</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendModel'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>builtin</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </crypto>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <interface supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>default</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>passt</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <panic supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>isa</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>hyperv</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </panic>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <features>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <gic supported='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <vmcoreinfo supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <genid supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <backingStoreInput supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <backup supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <async-teardown supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <ps2 supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <sev supported='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <sgx supported='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <hyperv supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='features'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>relaxed</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vapic</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>spinlocks</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vpindex</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>runtime</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>synic</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>stimer</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>reset</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vendor_id</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>frequencies</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>reenlightenment</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>tlbflush</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>ipi</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>avic</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>emsr_bitmap</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>xmm_input</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </hyperv>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <launchSecurity supported='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </features>
Oct 08 19:00:59 compute-0 nova_compute[117514]: </domainCapabilities>
Oct 08 19:00:59 compute-0 nova_compute[117514]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.616 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct 08 19:00:59 compute-0 nova_compute[117514]: <domainCapabilities>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <path>/usr/libexec/qemu-kvm</path>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <domain>kvm</domain>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <arch>i686</arch>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <vcpu max='240'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <iothreads supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <os supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <enum name='firmware'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <loader supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='type'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>rom</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>pflash</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='readonly'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>yes</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>no</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='secure'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>no</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </loader>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </os>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <cpu>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <mode name='host-passthrough' supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='hostPassthroughMigratable'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>on</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>off</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </mode>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <mode name='maximum' supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='maximumMigratable'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>on</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>off</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </mode>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <mode name='host-model' supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <vendor>AMD</vendor>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='x2apic'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='tsc-deadline'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='hypervisor'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='tsc_adjust'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='spec-ctrl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='stibp'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='arch-capabilities'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='ssbd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='cmp_legacy'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='overflow-recov'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='succor'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='ibrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='amd-ssbd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='virt-ssbd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='lbrv'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='tsc-scale'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='vmcb-clean'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='flushbyasid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='pause-filter'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='pfthreshold'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='svme-addr-chk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='rdctl-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='mds-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='pschange-mc-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='gds-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='rfds-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='disable' name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </mode>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <mode name='custom' supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-noTSX'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v5'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cooperlake'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cooperlake-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cooperlake-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Denverton'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mpx'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Denverton-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mpx'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Denverton-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Denverton-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Dhyana-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Genoa'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amd-psfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='auto-ibrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='no-nested-data-bp'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='null-sel-clr-base'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='stibp-always-on'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Genoa-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amd-psfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='auto-ibrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='no-nested-data-bp'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='null-sel-clr-base'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='stibp-always-on'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Milan'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Milan-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Milan-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amd-psfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='no-nested-data-bp'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='null-sel-clr-base'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='stibp-always-on'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Rome'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Rome-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Rome-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Rome-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='GraniteRapids'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='prefetchiti'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='GraniteRapids-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='prefetchiti'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='GraniteRapids-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx10'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx10-128'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx10-256'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx10-512'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='prefetchiti'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-noTSX'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-noTSX'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v5'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v6'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v7'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='IvyBridge'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='IvyBridge-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='IvyBridge-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='IvyBridge-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='KnightsMill'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-4fmaps'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-4vnniw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512er'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512pf'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='KnightsMill-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-4fmaps'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-4vnniw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512er'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512pf'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Opteron_G4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fma4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xop'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Opteron_G4-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fma4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xop'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Opteron_G5'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fma4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tbm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xop'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Opteron_G5-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fma4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tbm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xop'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SapphireRapids'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SapphireRapids-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SapphireRapids-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SapphireRapids-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SierraForest'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-ne-convert'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cmpccxadd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SierraForest-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-ne-convert'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cmpccxadd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v5'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='core-capability'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mpx'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='split-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='core-capability'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mpx'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='split-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='core-capability'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='split-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='core-capability'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='split-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='athlon'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnow'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnowext'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='athlon-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnow'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnowext'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='core2duo'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='core2duo-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='coreduo'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='coreduo-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='n270'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='n270-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='phenom'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnow'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnowext'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='phenom-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnow'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnowext'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </mode>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <memoryBacking supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <enum name='sourceType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>file</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>anonymous</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>memfd</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </memoryBacking>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <disk supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='diskDevice'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>disk</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>cdrom</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>floppy</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>lun</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='bus'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>ide</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>fdc</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>scsi</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>usb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>sata</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio-transitional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio-non-transitional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <graphics supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='type'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vnc</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>egl-headless</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>dbus</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </graphics>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <video supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='modelType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vga</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>cirrus</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>none</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>bochs</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>ramfb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </video>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <hostdev supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='mode'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>subsystem</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='startupPolicy'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>default</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>mandatory</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>requisite</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>optional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='subsysType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>usb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>pci</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>scsi</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='capsType'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='pciBackend'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </hostdev>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <rng supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio-transitional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio-non-transitional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendModel'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>random</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>egd</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>builtin</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <filesystem supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='driverType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>path</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>handle</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtiofs</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </filesystem>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <tpm supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>tpm-tis</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>tpm-crb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendModel'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>emulator</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>external</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendVersion'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>2.0</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </tpm>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <redirdev supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='bus'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>usb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </redirdev>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <channel supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='type'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>pty</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>unix</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </channel>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <crypto supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='type'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>qemu</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendModel'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>builtin</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </crypto>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <interface supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>default</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>passt</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <panic supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>isa</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>hyperv</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </panic>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <features>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <gic supported='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <vmcoreinfo supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <genid supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <backingStoreInput supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <backup supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <async-teardown supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <ps2 supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <sev supported='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <sgx supported='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <hyperv supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='features'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>relaxed</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vapic</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>spinlocks</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vpindex</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>runtime</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>synic</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>stimer</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>reset</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vendor_id</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>frequencies</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>reenlightenment</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>tlbflush</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>ipi</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>avic</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>emsr_bitmap</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>xmm_input</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </hyperv>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <launchSecurity supported='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </features>
Oct 08 19:00:59 compute-0 nova_compute[117514]: </domainCapabilities>
Oct 08 19:00:59 compute-0 nova_compute[117514]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.660 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.664 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct 08 19:00:59 compute-0 nova_compute[117514]: <domainCapabilities>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <path>/usr/libexec/qemu-kvm</path>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <domain>kvm</domain>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <arch>x86_64</arch>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <vcpu max='240'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <iothreads supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <os supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <enum name='firmware'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <loader supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='type'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>rom</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>pflash</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='readonly'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>yes</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>no</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='secure'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>no</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </loader>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </os>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <cpu>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <mode name='host-passthrough' supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='hostPassthroughMigratable'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>on</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>off</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </mode>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <mode name='maximum' supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='maximumMigratable'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>on</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>off</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </mode>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <mode name='host-model' supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <vendor>AMD</vendor>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='x2apic'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='tsc-deadline'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='hypervisor'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='tsc_adjust'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='spec-ctrl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='stibp'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='arch-capabilities'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='ssbd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='cmp_legacy'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='overflow-recov'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='succor'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='ibrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='amd-ssbd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='virt-ssbd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='lbrv'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='tsc-scale'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='vmcb-clean'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='flushbyasid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='pause-filter'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='pfthreshold'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='svme-addr-chk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='rdctl-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='mds-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='pschange-mc-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='gds-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='rfds-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='disable' name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </mode>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <mode name='custom' supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-noTSX'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v5'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cooperlake'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cooperlake-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cooperlake-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Denverton'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mpx'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Denverton-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mpx'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Denverton-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Denverton-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Dhyana-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Genoa'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amd-psfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='auto-ibrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='no-nested-data-bp'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='null-sel-clr-base'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='stibp-always-on'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Genoa-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amd-psfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='auto-ibrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='no-nested-data-bp'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='null-sel-clr-base'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='stibp-always-on'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Milan'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Milan-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Milan-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amd-psfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='no-nested-data-bp'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='null-sel-clr-base'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='stibp-always-on'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Rome'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Rome-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Rome-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Rome-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='GraniteRapids'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='prefetchiti'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='GraniteRapids-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='prefetchiti'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='GraniteRapids-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx10'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx10-128'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx10-256'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx10-512'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='prefetchiti'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-noTSX'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-noTSX'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v5'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v6'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v7'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='IvyBridge'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='IvyBridge-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='IvyBridge-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='IvyBridge-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='KnightsMill'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-4fmaps'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-4vnniw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512er'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512pf'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='KnightsMill-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-4fmaps'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-4vnniw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512er'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512pf'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Opteron_G4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fma4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xop'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Opteron_G4-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fma4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xop'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Opteron_G5'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fma4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tbm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xop'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Opteron_G5-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fma4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tbm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xop'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SapphireRapids'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SapphireRapids-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SapphireRapids-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SapphireRapids-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SierraForest'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-ne-convert'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cmpccxadd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SierraForest-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-ne-convert'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cmpccxadd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v5'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='core-capability'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mpx'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='split-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='core-capability'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mpx'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='split-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='core-capability'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='split-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='core-capability'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='split-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='athlon'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnow'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnowext'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='athlon-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnow'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnowext'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='core2duo'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='core2duo-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='coreduo'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='coreduo-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='n270'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='n270-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='phenom'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnow'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnowext'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='phenom-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnow'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnowext'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </mode>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <memoryBacking supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <enum name='sourceType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>file</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>anonymous</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>memfd</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </memoryBacking>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <disk supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='diskDevice'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>disk</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>cdrom</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>floppy</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>lun</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='bus'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>ide</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>fdc</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>scsi</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>usb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>sata</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio-transitional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio-non-transitional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <graphics supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='type'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vnc</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>egl-headless</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>dbus</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </graphics>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <video supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='modelType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vga</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>cirrus</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>none</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>bochs</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>ramfb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </video>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <hostdev supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='mode'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>subsystem</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='startupPolicy'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>default</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>mandatory</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>requisite</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>optional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='subsysType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>usb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>pci</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>scsi</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='capsType'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='pciBackend'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </hostdev>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <rng supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio-transitional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio-non-transitional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendModel'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>random</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>egd</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>builtin</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <filesystem supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='driverType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>path</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>handle</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtiofs</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </filesystem>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <tpm supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>tpm-tis</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>tpm-crb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendModel'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>emulator</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>external</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendVersion'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>2.0</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </tpm>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <redirdev supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='bus'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>usb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </redirdev>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <channel supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='type'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>pty</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>unix</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </channel>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <crypto supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='type'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>qemu</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendModel'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>builtin</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </crypto>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <interface supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>default</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>passt</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <panic supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>isa</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>hyperv</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </panic>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <features>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <gic supported='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <vmcoreinfo supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <genid supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <backingStoreInput supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <backup supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <async-teardown supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <ps2 supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <sev supported='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <sgx supported='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <hyperv supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='features'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>relaxed</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vapic</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>spinlocks</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vpindex</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>runtime</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>synic</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>stimer</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>reset</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vendor_id</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>frequencies</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>reenlightenment</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>tlbflush</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>ipi</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>avic</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>emsr_bitmap</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>xmm_input</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </hyperv>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <launchSecurity supported='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </features>
Oct 08 19:00:59 compute-0 nova_compute[117514]: </domainCapabilities>
Oct 08 19:00:59 compute-0 nova_compute[117514]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.729 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct 08 19:00:59 compute-0 nova_compute[117514]: <domainCapabilities>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <path>/usr/libexec/qemu-kvm</path>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <domain>kvm</domain>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <arch>x86_64</arch>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <vcpu max='4096'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <iothreads supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <os supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <enum name='firmware'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>efi</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <loader supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='type'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>rom</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>pflash</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='readonly'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>yes</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>no</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='secure'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>yes</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>no</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </loader>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </os>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <cpu>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <mode name='host-passthrough' supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='hostPassthroughMigratable'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>on</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>off</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </mode>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <mode name='maximum' supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='maximumMigratable'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>on</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>off</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </mode>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <mode name='host-model' supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <vendor>AMD</vendor>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='x2apic'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='tsc-deadline'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='hypervisor'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='tsc_adjust'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='spec-ctrl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='stibp'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='arch-capabilities'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='ssbd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='cmp_legacy'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='overflow-recov'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='succor'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='ibrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='amd-ssbd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='virt-ssbd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='lbrv'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='tsc-scale'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='vmcb-clean'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='flushbyasid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='pause-filter'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='pfthreshold'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='svme-addr-chk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='rdctl-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='mds-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='pschange-mc-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='gds-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='require' name='rfds-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <feature policy='disable' name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </mode>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <mode name='custom' supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-noTSX'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Broadwell-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cascadelake-Server-v5'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cooperlake'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cooperlake-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Cooperlake-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Denverton'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mpx'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Denverton-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mpx'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Denverton-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Denverton-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Dhyana-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Genoa'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amd-psfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='auto-ibrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='no-nested-data-bp'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='null-sel-clr-base'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='stibp-always-on'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Genoa-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amd-psfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='auto-ibrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='no-nested-data-bp'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='null-sel-clr-base'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='stibp-always-on'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Milan'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Milan-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Milan-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amd-psfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='no-nested-data-bp'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='null-sel-clr-base'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='stibp-always-on'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Rome'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Rome-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Rome-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-Rome-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='EPYC-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='GraniteRapids'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='prefetchiti'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='GraniteRapids-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='prefetchiti'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='GraniteRapids-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx10'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx10-128'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx10-256'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx10-512'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='prefetchiti'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-noTSX'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Haswell-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-noTSX'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v5'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v6'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Icelake-Server-v7'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='IvyBridge'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='IvyBridge-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='IvyBridge-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='IvyBridge-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='KnightsMill'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-4fmaps'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-4vnniw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512er'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512pf'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='KnightsMill-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-4fmaps'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-4vnniw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512er'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512pf'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Opteron_G4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fma4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xop'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Opteron_G4-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fma4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xop'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Opteron_G5'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fma4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tbm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xop'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Opteron_G5-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fma4'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tbm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xop'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SapphireRapids'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SapphireRapids-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SapphireRapids-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SapphireRapids-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='amx-tile'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-bf16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-fp16'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512-vpopcntdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bitalg'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vbmi2'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrc'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fzrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='la57'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='taa-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='tsx-ldtrk'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xfd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SierraForest'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-ne-convert'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cmpccxadd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='SierraForest-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-ifma'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-ne-convert'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx-vnni-int8'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='bus-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cmpccxadd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fbsdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='fsrs'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ibrs-all'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mcdt-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pbrsb-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='psdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='sbdr-ssdp-no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='serialize'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vaes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='vpclmulqdq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Client-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='hle'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='rtm'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Skylake-Server-v5'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512bw'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512cd'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512dq'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512f'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='avx512vl'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='invpcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pcid'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='pku'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='core-capability'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mpx'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='split-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='core-capability'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='mpx'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='split-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge-v2'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='core-capability'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='split-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge-v3'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='core-capability'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='split-lock-detect'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='Snowridge-v4'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='cldemote'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='erms'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='gfni'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdir64b'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='movdiri'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='xsaves'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='athlon'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnow'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnowext'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='athlon-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnow'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnowext'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='core2duo'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='core2duo-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='coreduo'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='coreduo-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='n270'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='n270-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='ss'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='phenom'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnow'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnowext'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <blockers model='phenom-v1'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnow'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <feature name='3dnowext'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </blockers>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </mode>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <memoryBacking supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <enum name='sourceType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>file</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>anonymous</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <value>memfd</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </memoryBacking>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <disk supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='diskDevice'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>disk</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>cdrom</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>floppy</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>lun</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='bus'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>fdc</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>scsi</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>usb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>sata</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio-transitional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio-non-transitional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <graphics supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='type'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vnc</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>egl-headless</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>dbus</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </graphics>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <video supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='modelType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vga</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>cirrus</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>none</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>bochs</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>ramfb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </video>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <hostdev supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='mode'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>subsystem</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='startupPolicy'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>default</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>mandatory</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>requisite</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>optional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='subsysType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>usb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>pci</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>scsi</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='capsType'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='pciBackend'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </hostdev>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <rng supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio-transitional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtio-non-transitional</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendModel'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>random</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>egd</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>builtin</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <filesystem supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='driverType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>path</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>handle</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>virtiofs</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </filesystem>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <tpm supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>tpm-tis</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>tpm-crb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendModel'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>emulator</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>external</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendVersion'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>2.0</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </tpm>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <redirdev supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='bus'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>usb</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </redirdev>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <channel supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='type'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>pty</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>unix</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </channel>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <crypto supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='type'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>qemu</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendModel'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>builtin</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </crypto>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <interface supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='backendType'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>default</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>passt</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <panic supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='model'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>isa</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>hyperv</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </panic>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   <features>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <gic supported='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <vmcoreinfo supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <genid supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <backingStoreInput supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <backup supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <async-teardown supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <ps2 supported='yes'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <sev supported='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <sgx supported='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <hyperv supported='yes'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       <enum name='features'>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>relaxed</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vapic</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>spinlocks</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vpindex</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>runtime</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>synic</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>stimer</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>reset</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>vendor_id</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>frequencies</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>reenlightenment</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>tlbflush</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>ipi</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>avic</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>emsr_bitmap</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:         <value>xmm_input</value>
Oct 08 19:00:59 compute-0 nova_compute[117514]:       </enum>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     </hyperv>
Oct 08 19:00:59 compute-0 nova_compute[117514]:     <launchSecurity supported='no'/>
Oct 08 19:00:59 compute-0 nova_compute[117514]:   </features>
Oct 08 19:00:59 compute-0 nova_compute[117514]: </domainCapabilities>
Oct 08 19:00:59 compute-0 nova_compute[117514]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.784 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.785 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.785 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.785 2 INFO nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Secure Boot support detected
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.788 2 INFO nova.virt.libvirt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.788 2 INFO nova.virt.libvirt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.798 2 DEBUG nova.virt.libvirt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.828 2 INFO nova.virt.node [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Determined node identity 8dadd82c-8ff0-43f1-888f-64abe8b5e349 from /var/lib/nova/compute_id
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.846 2 WARNING nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Compute nodes ['8dadd82c-8ff0-43f1-888f-64abe8b5e349'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.885 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.922 2 WARNING nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.922 2 DEBUG oslo_concurrency.lockutils [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.922 2 DEBUG oslo_concurrency.lockutils [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.923 2 DEBUG oslo_concurrency.lockutils [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.923 2 DEBUG nova.compute.resource_tracker [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:00:59 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Oct 08 19:00:59 compute-0 systemd[1]: Started libvirt nodedev daemon.
Oct 08 19:01:00 compute-0 nova_compute[117514]: 2025-10-08 19:01:00.243 2 WARNING nova.virt.libvirt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:01:00 compute-0 nova_compute[117514]: 2025-10-08 19:01:00.243 2 DEBUG nova.compute.resource_tracker [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6524MB free_disk=73.64043045043945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:01:00 compute-0 nova_compute[117514]: 2025-10-08 19:01:00.244 2 DEBUG oslo_concurrency.lockutils [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:01:00 compute-0 nova_compute[117514]: 2025-10-08 19:01:00.244 2 DEBUG oslo_concurrency.lockutils [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:01:00 compute-0 nova_compute[117514]: 2025-10-08 19:01:00.255 2 WARNING nova.compute.resource_tracker [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] No compute node record for compute-0.ctlplane.example.com:8dadd82c-8ff0-43f1-888f-64abe8b5e349: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 8dadd82c-8ff0-43f1-888f-64abe8b5e349 could not be found.
Oct 08 19:01:00 compute-0 nova_compute[117514]: 2025-10-08 19:01:00.268 2 INFO nova.compute.resource_tracker [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 8dadd82c-8ff0-43f1-888f-64abe8b5e349
Oct 08 19:01:00 compute-0 nova_compute[117514]: 2025-10-08 19:01:00.361 2 DEBUG nova.compute.resource_tracker [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:01:00 compute-0 nova_compute[117514]: 2025-10-08 19:01:00.361 2 DEBUG nova.compute.resource_tracker [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:01:00 compute-0 sshd-session[117461]: Failed password for root from 193.46.255.159 port 54464 ssh2
Oct 08 19:01:00 compute-0 unix_chkpwd[117846]: password check failed for user (root)
Oct 08 19:01:01 compute-0 CROND[117848]: (root) CMD (run-parts /etc/cron.hourly)
Oct 08 19:01:01 compute-0 run-parts[117851]: (/etc/cron.hourly) starting 0anacron
Oct 08 19:01:01 compute-0 run-parts[117857]: (/etc/cron.hourly) finished 0anacron
Oct 08 19:01:01 compute-0 CROND[117847]: (root) CMDEND (run-parts /etc/cron.hourly)
Oct 08 19:01:01 compute-0 nova_compute[117514]: 2025-10-08 19:01:01.470 2 INFO nova.scheduler.client.report [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [req-524900c9-b384-4f60-b315-b075558689a7] Created resource provider record via placement API for resource provider with UUID 8dadd82c-8ff0-43f1-888f-64abe8b5e349 and name compute-0.ctlplane.example.com.
Oct 08 19:01:01 compute-0 nova_compute[117514]: 2025-10-08 19:01:01.850 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Oct 08 19:01:01 compute-0 nova_compute[117514]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Oct 08 19:01:01 compute-0 nova_compute[117514]: 2025-10-08 19:01:01.851 2 INFO nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] kernel doesn't support AMD SEV
Oct 08 19:01:01 compute-0 nova_compute[117514]: 2025-10-08 19:01:01.852 2 DEBUG nova.compute.provider_tree [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Updating inventory in ProviderTree for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 08 19:01:01 compute-0 nova_compute[117514]: 2025-10-08 19:01:01.852 2 DEBUG nova.virt.libvirt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 08 19:01:01 compute-0 nova_compute[117514]: 2025-10-08 19:01:01.893 2 DEBUG nova.scheduler.client.report [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Updated inventory for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Oct 08 19:01:01 compute-0 nova_compute[117514]: 2025-10-08 19:01:01.893 2 DEBUG nova.compute.provider_tree [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Updating resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 08 19:01:01 compute-0 nova_compute[117514]: 2025-10-08 19:01:01.893 2 DEBUG nova.compute.provider_tree [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Updating inventory in ProviderTree for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 08 19:01:01 compute-0 nova_compute[117514]: 2025-10-08 19:01:01.991 2 DEBUG nova.compute.provider_tree [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Updating resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 08 19:01:02 compute-0 nova_compute[117514]: 2025-10-08 19:01:02.013 2 DEBUG nova.compute.resource_tracker [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:01:02 compute-0 nova_compute[117514]: 2025-10-08 19:01:02.013 2 DEBUG oslo_concurrency.lockutils [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.770s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:01:02 compute-0 nova_compute[117514]: 2025-10-08 19:01:02.013 2 DEBUG nova.service [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Oct 08 19:01:02 compute-0 nova_compute[117514]: 2025-10-08 19:01:02.099 2 DEBUG nova.service [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Oct 08 19:01:02 compute-0 nova_compute[117514]: 2025-10-08 19:01:02.100 2 DEBUG nova.servicegroup.drivers.db [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Oct 08 19:01:03 compute-0 sshd-session[117461]: Failed password for root from 193.46.255.159 port 54464 ssh2
Oct 08 19:01:03 compute-0 sshd-session[117461]: Received disconnect from 193.46.255.159 port 54464:11:  [preauth]
Oct 08 19:01:03 compute-0 sshd-session[117461]: Disconnected from authenticating user root 193.46.255.159 port 54464 [preauth]
Oct 08 19:01:03 compute-0 sshd-session[117461]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159  user=root
Oct 08 19:01:03 compute-0 sshd-session[117858]: Accepted publickey for zuul from 192.168.122.30 port 58912 ssh2: ECDSA SHA256:i+73Mx2Y/ukt1b+huf+9w+ftZalnyybbDU6glTR0JfU
Oct 08 19:01:03 compute-0 systemd-logind[844]: New session 11 of user zuul.
Oct 08 19:01:03 compute-0 systemd[1]: Started Session 11 of User zuul.
Oct 08 19:01:03 compute-0 sshd-session[117858]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 08 19:01:04 compute-0 unix_chkpwd[117987]: password check failed for user (root)
Oct 08 19:01:04 compute-0 sshd-session[117861]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159  user=root
Oct 08 19:01:04 compute-0 python3.9[118014]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 08 19:01:05 compute-0 auditd[775]: Audit daemon rotating log files
Oct 08 19:01:05 compute-0 sshd-session[117861]: Failed password for root from 193.46.255.159 port 15092 ssh2
Oct 08 19:01:05 compute-0 sudo[118168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orbkakqzqwvrxoratnvtckwylpdcuqdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950065.2504535-36-158217307770593/AnsiballZ_systemd_service.py'
Oct 08 19:01:05 compute-0 sudo[118168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:06 compute-0 python3.9[118170]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 08 19:01:06 compute-0 systemd[1]: Reloading.
Oct 08 19:01:06 compute-0 systemd-rc-local-generator[118194]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 19:01:06 compute-0 systemd-sysv-generator[118199]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 19:01:06 compute-0 sudo[118168]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:06 compute-0 unix_chkpwd[118237]: password check failed for user (root)
Oct 08 19:01:07 compute-0 python3.9[118356]: ansible-ansible.builtin.service_facts Invoked
Oct 08 19:01:07 compute-0 network[118373]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 08 19:01:07 compute-0 network[118374]: 'network-scripts' will be removed from distribution in near future.
Oct 08 19:01:07 compute-0 network[118375]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 08 19:01:09 compute-0 sshd-session[117861]: Failed password for root from 193.46.255.159 port 15092 ssh2
Oct 08 19:01:09 compute-0 unix_chkpwd[118419]: password check failed for user (root)
Oct 08 19:01:10 compute-0 sshd-session[117861]: Failed password for root from 193.46.255.159 port 15092 ssh2
Oct 08 19:01:11 compute-0 sshd-session[117861]: Received disconnect from 193.46.255.159 port 15092:11:  [preauth]
Oct 08 19:01:11 compute-0 sshd-session[117861]: Disconnected from authenticating user root 193.46.255.159 port 15092 [preauth]
Oct 08 19:01:11 compute-0 sshd-session[117861]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159  user=root
Oct 08 19:01:12 compute-0 podman[118526]: 2025-10-08 19:01:12.685814595 +0000 UTC m=+0.095871048 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct 08 19:01:12 compute-0 sudo[118670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsghesxoctdkiynzoxxxudrbtwggciyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950072.6249118-55-274781633011673/AnsiballZ_systemd_service.py'
Oct 08 19:01:12 compute-0 sudo[118670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:13 compute-0 python3.9[118672]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 19:01:13 compute-0 sudo[118670]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:14 compute-0 sudo[118823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ablphzozglenvuetarpvupsxjxmdzqnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950073.6212878-65-129849739887111/AnsiballZ_file.py'
Oct 08 19:01:14 compute-0 sudo[118823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:14 compute-0 python3.9[118825]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:14 compute-0 sudo[118823]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:14 compute-0 sudo[118975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkqeetxxttfidvqzvgrmnkgusgvaqvew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950074.5121949-73-163379079662606/AnsiballZ_file.py'
Oct 08 19:01:14 compute-0 sudo[118975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:15 compute-0 python3.9[118977]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:15 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 08 19:01:15 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 08 19:01:15 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 08 19:01:15 compute-0 sudo[118975]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:15 compute-0 sudo[119128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glglvwnakkfserzeatxjizsrfaxqslkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950075.2738907-82-219982780994808/AnsiballZ_command.py'
Oct 08 19:01:15 compute-0 sudo[119128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:15 compute-0 python3.9[119130]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 19:01:15 compute-0 sudo[119128]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:16 compute-0 python3.9[119282]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 08 19:01:17 compute-0 sudo[119432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnlxbzdtxrauqaxbswttkxdqrumsyvyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950077.092114-100-216150850564633/AnsiballZ_systemd_service.py'
Oct 08 19:01:17 compute-0 sudo[119432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:17 compute-0 python3.9[119434]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 08 19:01:17 compute-0 systemd[1]: Reloading.
Oct 08 19:01:17 compute-0 systemd-rc-local-generator[119461]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 19:01:17 compute-0 systemd-sysv-generator[119464]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 19:01:18 compute-0 sudo[119432]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:18 compute-0 nova_compute[117514]: 2025-10-08 19:01:18.102 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:01:18 compute-0 nova_compute[117514]: 2025-10-08 19:01:18.137 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:01:18 compute-0 sudo[119637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scydmbrlvepiunzrqsmfpdyulrsancoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950078.2641437-108-11038952611947/AnsiballZ_command.py'
Oct 08 19:01:18 compute-0 podman[119594]: 2025-10-08 19:01:18.657211402 +0000 UTC m=+0.084215974 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 08 19:01:18 compute-0 sudo[119637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:18 compute-0 python3.9[119642]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 19:01:18 compute-0 sudo[119637]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:19 compute-0 podman[119767]: 2025-10-08 19:01:19.498677312 +0000 UTC m=+0.066400713 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:01:19 compute-0 sudo[119810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zazyyclkfxalmnfzgjzeagohxpsyzczv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950079.1525984-117-142488037072152/AnsiballZ_file.py'
Oct 08 19:01:19 compute-0 sudo[119810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:19 compute-0 python3.9[119814]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:01:19 compute-0 sudo[119810]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:20 compute-0 podman[119938]: 2025-10-08 19:01:20.538019443 +0000 UTC m=+0.138839009 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 08 19:01:20 compute-0 python3.9[119977]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 19:01:21 compute-0 python3.9[120142]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:22 compute-0 python3.9[120263]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950080.9010713-133-277864867710341/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:01:22 compute-0 sudo[120413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skbyhkpvyaipgmbhnfkcngxmscyfgyvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950082.4219627-148-38477601952035/AnsiballZ_group.py'
Oct 08 19:01:22 compute-0 sudo[120413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:23 compute-0 python3.9[120415]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Oct 08 19:01:23 compute-0 sudo[120413]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:23 compute-0 sudo[120565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djjhovaoafihmvgajxblxlbnzubqcwnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950083.482406-159-53597266221489/AnsiballZ_getent.py'
Oct 08 19:01:23 compute-0 sudo[120565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:24 compute-0 python3.9[120567]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Oct 08 19:01:24 compute-0 sudo[120565]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:24 compute-0 sudo[120718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkvsefxezawavzijmltleparomxdqxqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950084.4022408-167-262298741982343/AnsiballZ_group.py'
Oct 08 19:01:24 compute-0 sudo[120718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:24 compute-0 python3.9[120720]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 08 19:01:24 compute-0 groupadd[120721]: group added to /etc/group: name=ceilometer, GID=42405
Oct 08 19:01:24 compute-0 groupadd[120721]: group added to /etc/gshadow: name=ceilometer
Oct 08 19:01:24 compute-0 groupadd[120721]: new group: name=ceilometer, GID=42405
Oct 08 19:01:25 compute-0 sudo[120718]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:25 compute-0 sudo[120876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbtyotxlxvxbfjrwjdzkcjummjpllorl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950085.2324276-175-240907906328936/AnsiballZ_user.py'
Oct 08 19:01:25 compute-0 sudo[120876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:26 compute-0 python3.9[120878]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 08 19:01:26 compute-0 useradd[120880]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Oct 08 19:01:26 compute-0 useradd[120880]: add 'ceilometer' to group 'libvirt'
Oct 08 19:01:26 compute-0 useradd[120880]: add 'ceilometer' to shadow group 'libvirt'
Oct 08 19:01:26 compute-0 sudo[120876]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:27 compute-0 python3.9[121036]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:27 compute-0 python3.9[121157]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759950086.776381-201-105578576111959/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:28 compute-0 python3.9[121307]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:29 compute-0 python3.9[121428]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759950088.1281173-201-132609462567754/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:30 compute-0 python3.9[121578]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:30 compute-0 python3.9[121699]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759950089.5040212-201-19806015132249/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:31 compute-0 python3.9[121849]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 19:01:32 compute-0 python3.9[122001]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 19:01:32 compute-0 python3.9[122153]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:33 compute-0 python3.9[122274]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950092.401349-260-234548934646737/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:34 compute-0 python3.9[122424]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:34 compute-0 python3.9[122500]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:35 compute-0 python3.9[122650]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:36 compute-0 python3.9[122771]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950094.9570293-260-38171448417766/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=6747f2067b9284624d06fbad47fbd56de1e9892c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:36 compute-0 python3.9[122921]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:37 compute-0 python3.9[123042]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950096.2232506-260-55177451141639/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:38 compute-0 python3.9[123192]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:38 compute-0 python3.9[123313]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950097.5152826-260-84171440704575/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:39 compute-0 python3.9[123463]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:39 compute-0 python3.9[123584]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950098.84073-260-166098265098715/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=3820eb6e48c35431ebf53228213a5d51b7591223 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:40 compute-0 sshd-session[123661]: banner exchange: Connection from 148.113.193.79 port 54892: invalid format
Oct 08 19:01:40 compute-0 python3.9[123735]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:41 compute-0 python3.9[123856]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950100.127853-260-253228229858692/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:41 compute-0 python3.9[124006]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:42 compute-0 python3.9[124127]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950101.361017-260-280697138710755/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=33df3bf08923ad9105770f5abb51d4cde791931a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:42 compute-0 podman[124251]: 2025-10-08 19:01:42.943652628 +0000 UTC m=+0.061486843 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:01:43 compute-0 python3.9[124292]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:43 compute-0 python3.9[124418]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950102.61923-260-110516510466088/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:01:44.220 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:01:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:01:44.221 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:01:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:01:44.221 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:01:44 compute-0 python3.9[124568]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:44 compute-0 python3.9[124689]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950103.8702881-260-216314575236660/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=8bed8129af2c9145e8d37569bb493c0de1895d6f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:45 compute-0 python3.9[124839]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:46 compute-0 python3.9[124960]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950105.1166286-260-225012113742851/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:47 compute-0 python3.9[125110]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:47 compute-0 python3.9[125186]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:48 compute-0 python3.9[125336]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:48 compute-0 python3.9[125412]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:48 compute-0 podman[125413]: 2025-10-08 19:01:48.99684043 +0000 UTC m=+0.086904631 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 08 19:01:49 compute-0 podman[125583]: 2025-10-08 19:01:49.646125153 +0000 UTC m=+0.059698082 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Oct 08 19:01:49 compute-0 python3.9[125582]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:50 compute-0 python3.9[125677]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:50 compute-0 sudo[125841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvucikjlylxujllxterglnpolgdbmpcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950110.3663747-449-191988443988265/AnsiballZ_file.py'
Oct 08 19:01:50 compute-0 sudo[125841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:50 compute-0 podman[125801]: 2025-10-08 19:01:50.799170651 +0000 UTC m=+0.134366091 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 08 19:01:50 compute-0 python3.9[125849]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:50 compute-0 sudo[125841]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:51 compute-0 sudo[126006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwkowrkgnavstgdirlxwcaelyzlpwpfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950111.1389837-457-132913672800855/AnsiballZ_file.py'
Oct 08 19:01:51 compute-0 sudo[126006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:51 compute-0 python3.9[126008]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:01:51 compute-0 sudo[126006]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:52 compute-0 sudo[126158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkqxxamfjcawdfmombkrseglclfcgdyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950111.9306254-465-24567046495143/AnsiballZ_file.py'
Oct 08 19:01:52 compute-0 sudo[126158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:52 compute-0 python3.9[126160]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:01:52 compute-0 sudo[126158]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:53 compute-0 sudo[126310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilxewjsonaksstuphunemwvqlupusmep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950112.6630085-473-247899857712560/AnsiballZ_systemd_service.py'
Oct 08 19:01:53 compute-0 sudo[126310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:53 compute-0 python3.9[126312]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 19:01:53 compute-0 systemd[1]: Reloading.
Oct 08 19:01:53 compute-0 systemd-sysv-generator[126346]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 19:01:53 compute-0 systemd-rc-local-generator[126343]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 19:01:53 compute-0 systemd[1]: Listening on Podman API Socket.
Oct 08 19:01:53 compute-0 sudo[126310]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:54 compute-0 sudo[126502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgquctdfztygerywvdxzjhcolihwlbqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950114.1005082-482-101389055198168/AnsiballZ_stat.py'
Oct 08 19:01:54 compute-0 sudo[126502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:54 compute-0 python3.9[126504]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:54 compute-0 sudo[126502]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:55 compute-0 sudo[126625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftzfutffhhwuoohhsstpmkqjbkbvlage ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950114.1005082-482-101389055198168/AnsiballZ_copy.py'
Oct 08 19:01:55 compute-0 sudo[126625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:55 compute-0 python3.9[126627]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950114.1005082-482-101389055198168/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:01:55 compute-0 sudo[126625]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:55 compute-0 sudo[126701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngzfaeafzmybdotyydgozchwjbwalbbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950114.1005082-482-101389055198168/AnsiballZ_stat.py'
Oct 08 19:01:55 compute-0 sudo[126701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:55 compute-0 python3.9[126703]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:01:55 compute-0 sudo[126701]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:56 compute-0 sudo[126824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-undeiwbjajvuunxkovokhetimhytqvdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950114.1005082-482-101389055198168/AnsiballZ_copy.py'
Oct 08 19:01:56 compute-0 sudo[126824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:56 compute-0 python3.9[126826]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950114.1005082-482-101389055198168/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:01:56 compute-0 sudo[126824]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:57 compute-0 sudo[126976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buvlleykixefwsunffjutuwzbikwepts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950116.7430124-510-17150114664647/AnsiballZ_container_config_data.py'
Oct 08 19:01:57 compute-0 sudo[126976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:57 compute-0 python3.9[126978]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Oct 08 19:01:57 compute-0 sudo[126976]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:58 compute-0 sudo[127128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytmrxuwyurwjdurdqhxcltadclrrwwue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950117.7527814-519-183030551042810/AnsiballZ_container_config_hash.py'
Oct 08 19:01:58 compute-0 sudo[127128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:58 compute-0 python3.9[127130]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 08 19:01:58 compute-0 sudo[127128]: pam_unix(sudo:session): session closed for user root
Oct 08 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.719 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.720 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.721 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.721 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.769 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 08 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.769 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.770 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.770 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.771 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.771 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.772 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.772 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.773 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.801 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.802 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.802 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.803 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.006 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.007 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6529MB free_disk=73.6331787109375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.007 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.008 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.065 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.066 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.097 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.110 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.112 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.112 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:01:59 compute-0 sudo[127280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iphmjuijjxvxgkhhodxqvuukemhyzqic ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759950118.8128228-529-262477655553499/AnsiballZ_edpm_container_manage.py'
Oct 08 19:01:59 compute-0 sudo[127280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:01:59 compute-0 python3[127282]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct 08 19:01:59 compute-0 podman[127319]: 2025-10-08 19:01:59.9479515 +0000 UTC m=+0.057012564 container create e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Oct 08 19:01:59 compute-0 podman[127319]: 2025-10-08 19:01:59.916845229 +0000 UTC m=+0.025906283 image pull 5397cd841d80292a5786d82cb8a2bcd574988efb08c605ba6eaaa59d6f646815 quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189
Oct 08 19:01:59 compute-0 python3[127282]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189 kolla_start
Oct 08 19:02:00 compute-0 sudo[127280]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:00 compute-0 sudo[127506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdlcsmstkghwxhntfrpltasklvmakznw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950120.3248272-537-190674488641905/AnsiballZ_stat.py'
Oct 08 19:02:00 compute-0 sudo[127506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:00 compute-0 python3.9[127508]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 19:02:00 compute-0 sudo[127506]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:01 compute-0 sudo[127660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxiunnvqmrjdlfqizlgnprxkukdhlpwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950121.2216432-546-147923241756365/AnsiballZ_file.py'
Oct 08 19:02:01 compute-0 sudo[127660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:01 compute-0 python3.9[127662]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:02:01 compute-0 sudo[127660]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:02 compute-0 sudo[127811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndtingahmiwygipmqtdznlbsybeoktfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950121.8551555-546-74308157265001/AnsiballZ_copy.py'
Oct 08 19:02:02 compute-0 sudo[127811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:02 compute-0 python3.9[127813]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759950121.8551555-546-74308157265001/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:02:02 compute-0 sudo[127811]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:03 compute-0 sudo[127887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yspitlxyyegsabxwubjdoflmeuenfnvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950121.8551555-546-74308157265001/AnsiballZ_systemd.py'
Oct 08 19:02:03 compute-0 sudo[127887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:03 compute-0 python3.9[127889]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 08 19:02:03 compute-0 systemd[1]: Reloading.
Oct 08 19:02:03 compute-0 systemd-rc-local-generator[127910]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 19:02:03 compute-0 systemd-sysv-generator[127918]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 19:02:04 compute-0 sudo[127887]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:04 compute-0 sudo[127998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lielebysevysdtiwhllcgniggehxknaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950121.8551555-546-74308157265001/AnsiballZ_systemd.py'
Oct 08 19:02:04 compute-0 sudo[127998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:04 compute-0 python3.9[128000]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 19:02:04 compute-0 systemd[1]: Reloading.
Oct 08 19:02:04 compute-0 systemd-rc-local-generator[128029]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 19:02:04 compute-0 systemd-sysv-generator[128032]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 19:02:05 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Oct 08 19:02:05 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:02:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0209ee27825f3f77b76d9449285ecee2d0c8c3ef3bf4e58b622139dc871d8590/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0209ee27825f3f77b76d9449285ecee2d0c8c3ef3bf4e58b622139dc871d8590/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0209ee27825f3f77b76d9449285ecee2d0c8c3ef3bf4e58b622139dc871d8590/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0209ee27825f3f77b76d9449285ecee2d0c8c3ef3bf4e58b622139dc871d8590/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:05 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f.
Oct 08 19:02:05 compute-0 podman[128040]: 2025-10-08 19:02:05.274366285 +0000 UTC m=+0.158798209 container init e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3)
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: + sudo -E kolla_set_configs
Oct 08 19:02:05 compute-0 podman[128040]: 2025-10-08 19:02:05.309051018 +0000 UTC m=+0.193482892 container start e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:02:05 compute-0 podman[128040]: ceilometer_agent_compute
Oct 08 19:02:05 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Oct 08 19:02:05 compute-0 sudo[128062]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 08 19:02:05 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: sudo: unable to send audit message: Operation not permitted
Oct 08 19:02:05 compute-0 sudo[128062]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 08 19:02:05 compute-0 sudo[128062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 08 19:02:05 compute-0 sudo[127998]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:05 compute-0 podman[128061]: 2025-10-08 19:02:05.376227932 +0000 UTC m=+0.058904388 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 08 19:02:05 compute-0 systemd[1]: e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f-52b4da775efe27cc.service: Main process exited, code=exited, status=1/FAILURE
Oct 08 19:02:05 compute-0 systemd[1]: e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f-52b4da775efe27cc.service: Failed with result 'exit-code'.
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Validating config file
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Copying service configuration files
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Writing out command to execute
Oct 08 19:02:05 compute-0 sudo[128062]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: ++ cat /run_command
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: + ARGS=
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: + sudo kolla_copy_cacerts
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: sudo: unable to send audit message: Operation not permitted
Oct 08 19:02:05 compute-0 sudo[128107]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 08 19:02:05 compute-0 sudo[128107]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 08 19:02:05 compute-0 sudo[128107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 08 19:02:05 compute-0 sudo[128107]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: + [[ ! -n '' ]]
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: + . kolla_extend_start
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: + umask 0022
Oct 08 19:02:05 compute-0 ceilometer_agent_compute[128055]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Oct 08 19:02:05 compute-0 sudo[128238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njreqihmeyvvzcprronpacazhkfdvywo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950125.5649796-570-41414720863828/AnsiballZ_systemd.py'
Oct 08 19:02:05 compute-0 sudo[128238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:06 compute-0 python3.9[128240]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 08 19:02:06 compute-0 systemd[1]: Stopping ceilometer_agent_compute container...
Oct 08 19:02:06 compute-0 systemd[1]: libpod-e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f.scope: Deactivated successfully.
Oct 08 19:02:06 compute-0 podman[128244]: 2025-10-08 19:02:06.352771451 +0000 UTC m=+0.041176500 container died e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:02:06 compute-0 systemd[1]: e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f-52b4da775efe27cc.timer: Deactivated successfully.
Oct 08 19:02:06 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f.
Oct 08 19:02:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f-userdata-shm.mount: Deactivated successfully.
Oct 08 19:02:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-0209ee27825f3f77b76d9449285ecee2d0c8c3ef3bf4e58b622139dc871d8590-merged.mount: Deactivated successfully.
Oct 08 19:02:06 compute-0 podman[128244]: 2025-10-08 19:02:06.433313738 +0000 UTC m=+0.121718787 container cleanup e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct 08 19:02:06 compute-0 podman[128244]: ceilometer_agent_compute
Oct 08 19:02:06 compute-0 podman[128274]: ceilometer_agent_compute
Oct 08 19:02:06 compute-0 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Oct 08 19:02:06 compute-0 systemd[1]: Stopped ceilometer_agent_compute container.
Oct 08 19:02:06 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Oct 08 19:02:06 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:02:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0209ee27825f3f77b76d9449285ecee2d0c8c3ef3bf4e58b622139dc871d8590/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0209ee27825f3f77b76d9449285ecee2d0c8c3ef3bf4e58b622139dc871d8590/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0209ee27825f3f77b76d9449285ecee2d0c8c3ef3bf4e58b622139dc871d8590/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0209ee27825f3f77b76d9449285ecee2d0c8c3ef3bf4e58b622139dc871d8590/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:06 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f.
Oct 08 19:02:06 compute-0 podman[128287]: 2025-10-08 19:02:06.683115022 +0000 UTC m=+0.155092893 container init e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: + sudo -E kolla_set_configs
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: sudo: unable to send audit message: Operation not permitted
Oct 08 19:02:06 compute-0 sudo[128309]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 08 19:02:06 compute-0 sudo[128309]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 08 19:02:06 compute-0 sudo[128309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 08 19:02:06 compute-0 podman[128287]: 2025-10-08 19:02:06.726984719 +0000 UTC m=+0.198962590 container start e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 08 19:02:06 compute-0 podman[128287]: ceilometer_agent_compute
Oct 08 19:02:06 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Validating config file
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Copying service configuration files
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct 08 19:02:06 compute-0 sudo[128238]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Writing out command to execute
Oct 08 19:02:06 compute-0 sudo[128309]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: ++ cat /run_command
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: + ARGS=
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: + sudo kolla_copy_cacerts
Oct 08 19:02:06 compute-0 podman[128310]: 2025-10-08 19:02:06.817834371 +0000 UTC m=+0.073486206 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 08 19:02:06 compute-0 systemd[1]: e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f-39eec91179f1428e.service: Main process exited, code=exited, status=1/FAILURE
Oct 08 19:02:06 compute-0 systemd[1]: e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f-39eec91179f1428e.service: Failed with result 'exit-code'.
Oct 08 19:02:06 compute-0 sudo[128331]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: sudo: unable to send audit message: Operation not permitted
Oct 08 19:02:06 compute-0 sudo[128331]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 08 19:02:06 compute-0 sudo[128331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Oct 08 19:02:06 compute-0 sudo[128331]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: + [[ ! -n '' ]]
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: + . kolla_extend_start
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: + umask 0022
Oct 08 19:02:06 compute-0 ceilometer_agent_compute[128303]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Oct 08 19:02:07 compute-0 sudo[128485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywhiwljigjoxunrfxuselgxizxrpwpdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950126.9605684-578-110074488624178/AnsiballZ_stat.py'
Oct 08 19:02:07 compute-0 sudo[128485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:07 compute-0 python3.9[128487]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:02:07 compute-0 sudo[128485]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.903 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.903 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.903 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.904 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.904 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.904 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.905 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.905 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.905 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.905 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.905 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.906 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.906 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.906 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.906 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.907 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.907 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.907 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.907 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.907 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.908 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.908 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.908 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.908 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.908 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.909 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.909 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.909 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.909 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.910 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.910 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.910 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.910 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.911 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.911 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.911 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.911 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.911 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.912 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.912 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.912 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.912 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.912 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.913 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.913 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.913 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.913 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.913 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.914 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.914 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.914 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.914 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.915 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.915 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.915 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.915 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.915 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.916 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.916 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.916 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.916 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.916 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.917 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.917 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.917 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.917 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.917 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.918 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.918 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.918 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.918 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.918 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.919 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.919 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.919 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.919 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.919 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.920 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.920 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.920 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.920 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.921 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.921 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.921 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.921 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.921 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.921 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.921 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.924 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.924 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.924 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.924 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.924 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.924 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.924 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.924 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.924 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.929 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 08 19:02:07 compute-0 sudo[128608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsmxvfadpaszpvmvtvrrltkkoxkpvagp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950126.9605684-578-110074488624178/AnsiballZ_copy.py'
Oct 08 19:02:07 compute-0 sudo[128608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.946 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.947 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Oct 08 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.948 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.039 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct 08 19:02:08 compute-0 python3.9[128611]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950126.9605684-578-110074488624178/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:02:08 compute-0 sudo[128608]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.198 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.198 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.198 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.198 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.198 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.199 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.199 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.199 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.199 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.199 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.199 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.199 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.199 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.200 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.200 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.200 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.200 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.201 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.201 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.201 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.201 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.201 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.201 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.201 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.201 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.202 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.202 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.202 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.202 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.202 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.202 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.202 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.202 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.203 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.203 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.203 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.203 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.203 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.203 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.203 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.203 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.204 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.204 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.204 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.204 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.204 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.204 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.204 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.204 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.205 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.205 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.205 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.205 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.205 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.205 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.205 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.206 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.206 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.206 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.206 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.206 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.206 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.206 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.206 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.206 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.207 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.207 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.207 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.207 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.207 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.207 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.207 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.209 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.209 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.209 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.209 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.209 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.209 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.209 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.209 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.209 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.210 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.210 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.210 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.210 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.210 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.210 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.210 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.210 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.211 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.211 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.211 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.211 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.211 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.211 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.212 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.212 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.212 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.212 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.212 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.212 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.212 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.213 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.213 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.213 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.213 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.213 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.213 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.213 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.213 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.214 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.214 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.214 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.214 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.214 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.214 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.214 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.214 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.215 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.215 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.215 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.215 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.215 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.215 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.215 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.215 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.218 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.219 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.219 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.219 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.219 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.219 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.219 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.219 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.220 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.220 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.220 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.220 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.220 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.220 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.220 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.221 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.221 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.221 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.221 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.221 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.221 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.221 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.221 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.221 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.222 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.222 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.222 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.222 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.222 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.222 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.222 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.222 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.223 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.223 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.223 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.223 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.223 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.223 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.223 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.223 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.224 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.224 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.224 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.224 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.224 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.224 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.224 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.225 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.225 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.225 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.225 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.225 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.226 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.226 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.226 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.226 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.226 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.226 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.226 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.230 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.237 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.241 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:02:08 compute-0 sudo[128766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdouahwrdycslwrbxastymgfuljcusgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950128.4764242-595-203398420245306/AnsiballZ_container_config_data.py'
Oct 08 19:02:08 compute-0 sudo[128766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:09 compute-0 python3.9[128768]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Oct 08 19:02:09 compute-0 sudo[128766]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:09 compute-0 sudo[128918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drtqzlcqymdpfcghfkqerbladsiwczib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950129.3698812-604-35809815788003/AnsiballZ_container_config_hash.py'
Oct 08 19:02:09 compute-0 sudo[128918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:10 compute-0 python3.9[128920]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 08 19:02:10 compute-0 sudo[128918]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:10 compute-0 sudo[129070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bahqyttkinicsbfzpxyqskemsrdqpssf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759950130.3428013-614-245091950635803/AnsiballZ_edpm_container_manage.py'
Oct 08 19:02:10 compute-0 sudo[129070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:11 compute-0 python3[129072]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct 08 19:02:11 compute-0 podman[129107]: 2025-10-08 19:02:11.275343904 +0000 UTC m=+0.064693834 container create 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 08 19:02:11 compute-0 podman[129107]: 2025-10-08 19:02:11.241813904 +0000 UTC m=+0.031163844 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Oct 08 19:02:11 compute-0 python3[129072]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume 
/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Oct 08 19:02:11 compute-0 sudo[129070]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:11 compute-0 sudo[129295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydvbkcqzfomucllaidhscdrewivlkvov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950131.589422-622-264098081754285/AnsiballZ_stat.py'
Oct 08 19:02:11 compute-0 sudo[129295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:12 compute-0 python3.9[129297]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 19:02:12 compute-0 sudo[129295]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:12 compute-0 sudo[129449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezudydgfyjndsdysxkaxftfojvpsneqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950132.363168-631-55590390360867/AnsiballZ_file.py'
Oct 08 19:02:12 compute-0 sudo[129449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:12 compute-0 python3.9[129451]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:02:12 compute-0 sudo[129449]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:13 compute-0 sudo[129609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlxfgarzyxubvjpjkzsunqwbimthdccb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950133.0007854-631-156586757807026/AnsiballZ_copy.py'
Oct 08 19:02:13 compute-0 sudo[129609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:13 compute-0 podman[129574]: 2025-10-08 19:02:13.578913339 +0000 UTC m=+0.088195716 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd)
Oct 08 19:02:13 compute-0 python3.9[129617]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759950133.0007854-631-156586757807026/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:02:13 compute-0 sudo[129609]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:14 compute-0 sudo[129692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiaffswckcptcpdblfahleylhdmuudbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950133.0007854-631-156586757807026/AnsiballZ_systemd.py'
Oct 08 19:02:14 compute-0 sudo[129692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:14 compute-0 python3.9[129694]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 08 19:02:14 compute-0 systemd[1]: Reloading.
Oct 08 19:02:14 compute-0 systemd-sysv-generator[129723]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 19:02:14 compute-0 systemd-rc-local-generator[129720]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 19:02:14 compute-0 systemd[1]: Starting dnf makecache...
Oct 08 19:02:14 compute-0 sudo[129692]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:14 compute-0 dnf[129729]: Repository 'gating-repo' is missing name in configuration, using id.
Oct 08 19:02:14 compute-0 dnf[129729]: Metadata cache refreshed recently.
Oct 08 19:02:14 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct 08 19:02:14 compute-0 systemd[1]: Finished dnf makecache.
Oct 08 19:02:14 compute-0 sudo[129803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxgbjvsqyjdkmvuhrcorsipuxhxuaobt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950133.0007854-631-156586757807026/AnsiballZ_systemd.py'
Oct 08 19:02:14 compute-0 sudo[129803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:15 compute-0 python3.9[129805]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 19:02:15 compute-0 systemd[1]: Reloading.
Oct 08 19:02:15 compute-0 systemd-rc-local-generator[129835]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 19:02:15 compute-0 systemd-sysv-generator[129838]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 19:02:15 compute-0 systemd[1]: Starting node_exporter container...
Oct 08 19:02:15 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:02:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da29446cf246f7a9927e318b3b3f382c7ef69cd14bb17f2b6f9a6d3043149995/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da29446cf246f7a9927e318b3b3f382c7ef69cd14bb17f2b6f9a6d3043149995/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:15 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213.
Oct 08 19:02:15 compute-0 podman[129846]: 2025-10-08 19:02:15.805636323 +0000 UTC m=+0.182451777 container init 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 08 19:02:15 compute-0 podman[129846]: 2025-10-08 19:02:15.83732952 +0000 UTC m=+0.214144934 container start 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 08 19:02:15 compute-0 podman[129846]: node_exporter
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.852Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Oct 08 19:02:15 compute-0 systemd[1]: Started node_exporter container.
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.852Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.853Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.854Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.855Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.855Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.855Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.855Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.855Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=arp
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=bcache
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=bonding
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=btrfs
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=conntrack
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=cpu
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=cpufreq
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=diskstats
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=edac
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=fibrechannel
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=filefd
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=filesystem
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=infiniband
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=ipvs
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=loadavg
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=mdadm
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=meminfo
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=netclass
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=netdev
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=netstat
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=nfs
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=nfsd
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=nvme
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=schedstat
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=sockstat
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=softnet
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=systemd
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=tapestats
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=udp_queues
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=vmstat
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=xfs
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=zfs
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.857Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Oct 08 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.857Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Oct 08 19:02:15 compute-0 sudo[129803]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:15 compute-0 podman[129867]: 2025-10-08 19:02:15.92599526 +0000 UTC m=+0.079518439 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:02:16 compute-0 sudo[130044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvbbkzlwrnqlkrfdgyzqnodcszxkjyln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950136.1097672-655-161464484714776/AnsiballZ_systemd.py'
Oct 08 19:02:16 compute-0 sudo[130044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:16 compute-0 python3.9[130046]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 08 19:02:16 compute-0 systemd[1]: Stopping node_exporter container...
Oct 08 19:02:16 compute-0 systemd[1]: libpod-9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213.scope: Deactivated successfully.
Oct 08 19:02:16 compute-0 podman[130050]: 2025-10-08 19:02:16.850012454 +0000 UTC m=+0.045942197 container died 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:02:16 compute-0 systemd[1]: 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213-3eedbe080720916f.timer: Deactivated successfully.
Oct 08 19:02:16 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213.
Oct 08 19:02:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213-userdata-shm.mount: Deactivated successfully.
Oct 08 19:02:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-da29446cf246f7a9927e318b3b3f382c7ef69cd14bb17f2b6f9a6d3043149995-merged.mount: Deactivated successfully.
Oct 08 19:02:16 compute-0 podman[130050]: 2025-10-08 19:02:16.976453116 +0000 UTC m=+0.172382849 container cleanup 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:02:16 compute-0 podman[130050]: node_exporter
Oct 08 19:02:16 compute-0 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct 08 19:02:17 compute-0 podman[130083]: node_exporter
Oct 08 19:02:17 compute-0 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Oct 08 19:02:17 compute-0 systemd[1]: Stopped node_exporter container.
Oct 08 19:02:17 compute-0 systemd[1]: Starting node_exporter container...
Oct 08 19:02:17 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:02:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da29446cf246f7a9927e318b3b3f382c7ef69cd14bb17f2b6f9a6d3043149995/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da29446cf246f7a9927e318b3b3f382c7ef69cd14bb17f2b6f9a6d3043149995/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:17 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213.
Oct 08 19:02:17 compute-0 podman[130095]: 2025-10-08 19:02:17.26572442 +0000 UTC m=+0.173198011 container init 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.282Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.282Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.282Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.283Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.283Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.283Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.283Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.284Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.284Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.284Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.284Z caller=node_exporter.go:117 level=info collector=arp
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.284Z caller=node_exporter.go:117 level=info collector=bcache
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.284Z caller=node_exporter.go:117 level=info collector=bonding
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=btrfs
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=conntrack
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=cpu
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=cpufreq
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=diskstats
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=edac
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=fibrechannel
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=filefd
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=filesystem
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=infiniband
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=ipvs
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=loadavg
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=mdadm
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=meminfo
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=netclass
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=netdev
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=netstat
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=nfs
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=nfsd
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=nvme
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=schedstat
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=sockstat
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=softnet
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=systemd
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=tapestats
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=udp_queues
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=vmstat
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=xfs
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=zfs
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.286Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Oct 08 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.286Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Oct 08 19:02:17 compute-0 podman[130095]: 2025-10-08 19:02:17.296461431 +0000 UTC m=+0.203935042 container start 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:02:17 compute-0 podman[130095]: node_exporter
Oct 08 19:02:17 compute-0 systemd[1]: Started node_exporter container.
Oct 08 19:02:17 compute-0 sudo[130044]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:17 compute-0 podman[130120]: 2025-10-08 19:02:17.392571913 +0000 UTC m=+0.083825231 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 08 19:02:17 compute-0 sudo[130293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnzwecdhtuvitkpgtnkzuxezogzamgia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950137.550567-663-61679827133784/AnsiballZ_stat.py'
Oct 08 19:02:17 compute-0 sudo[130293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:18 compute-0 python3.9[130295]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:02:18 compute-0 sudo[130293]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:18 compute-0 sudo[130416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtwepzukwrdzuowkuxlnhxhnvqypjavb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950137.550567-663-61679827133784/AnsiballZ_copy.py'
Oct 08 19:02:18 compute-0 sudo[130416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:18 compute-0 python3.9[130418]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950137.550567-663-61679827133784/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:02:18 compute-0 sudo[130416]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:19 compute-0 sudo[130580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnnrxgouinuniqdlsnyaolnenwbqvzbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950138.9868379-680-97349508488918/AnsiballZ_container_config_data.py'
Oct 08 19:02:19 compute-0 sudo[130580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:19 compute-0 podman[130542]: 2025-10-08 19:02:19.390992488 +0000 UTC m=+0.093152519 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:02:19 compute-0 python3.9[130589]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Oct 08 19:02:19 compute-0 sudo[130580]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:20 compute-0 sudo[130753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxwlumauwofgxxesutvpjihwnnkjccpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950139.8646564-689-2559679600992/AnsiballZ_container_config_hash.py'
Oct 08 19:02:20 compute-0 sudo[130753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:20 compute-0 podman[130714]: 2025-10-08 19:02:20.254301773 +0000 UTC m=+0.063005975 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Oct 08 19:02:20 compute-0 python3.9[130761]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 08 19:02:20 compute-0 sudo[130753]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:21 compute-0 sudo[130924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rutbqxsxnfxiivqljkwcjiluydameslc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759950140.7274683-699-241621867928208/AnsiballZ_edpm_container_manage.py'
Oct 08 19:02:21 compute-0 sudo[130924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:21 compute-0 podman[130885]: 2025-10-08 19:02:21.178442741 +0000 UTC m=+0.159820028 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 08 19:02:21 compute-0 python3[130930]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct 08 19:02:22 compute-0 podman[130952]: 2025-10-08 19:02:22.790830519 +0000 UTC m=+1.361819744 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Oct 08 19:02:22 compute-0 podman[131050]: 2025-10-08 19:02:22.917630501 +0000 UTC m=+0.037460794 container create 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter)
Oct 08 19:02:22 compute-0 podman[131050]: 2025-10-08 19:02:22.898724209 +0000 UTC m=+0.018554512 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Oct 08 19:02:22 compute-0 python3[130930]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Oct 08 19:02:23 compute-0 sudo[130924]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:23 compute-0 sudo[131238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxpknnfqmahtlmdlxezkwknhpwyapovp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950143.3885674-707-35898267086342/AnsiballZ_stat.py'
Oct 08 19:02:23 compute-0 sudo[131238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:23 compute-0 python3.9[131240]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 19:02:23 compute-0 sudo[131238]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:24 compute-0 sudo[131392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spfznrgudoceqxalteyxiqtgpngoolgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950144.1672487-716-27846449059107/AnsiballZ_file.py'
Oct 08 19:02:24 compute-0 sudo[131392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:24 compute-0 python3.9[131394]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:02:24 compute-0 sudo[131392]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:25 compute-0 sudo[131543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyvlgicpahcupevplhlpwaguhrdkgttc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950144.8437123-716-136675860338874/AnsiballZ_copy.py'
Oct 08 19:02:25 compute-0 sudo[131543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:25 compute-0 python3.9[131545]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759950144.8437123-716-136675860338874/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:02:25 compute-0 sudo[131543]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:25 compute-0 sudo[131619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfebvumlvailepaotuuwiwjymzjnrwjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950144.8437123-716-136675860338874/AnsiballZ_systemd.py'
Oct 08 19:02:25 compute-0 sudo[131619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:26 compute-0 python3.9[131621]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 08 19:02:26 compute-0 systemd[1]: Reloading.
Oct 08 19:02:26 compute-0 systemd-rc-local-generator[131645]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 19:02:26 compute-0 systemd-sysv-generator[131649]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 19:02:26 compute-0 sudo[131619]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:26 compute-0 sudo[131730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiwpwvrltolbtgicekrvklrzglilqvjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950144.8437123-716-136675860338874/AnsiballZ_systemd.py'
Oct 08 19:02:26 compute-0 sudo[131730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:27 compute-0 python3.9[131732]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 19:02:27 compute-0 systemd[1]: Reloading.
Oct 08 19:02:27 compute-0 systemd-sysv-generator[131768]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 19:02:27 compute-0 systemd-rc-local-generator[131764]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 19:02:27 compute-0 systemd[1]: Starting podman_exporter container...
Oct 08 19:02:27 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:02:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a326ecbc4abca4f807e2a12af1256dd5205b8dffe1da0f2d2bd48cfaf93a0a8a/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a326ecbc4abca4f807e2a12af1256dd5205b8dffe1da0f2d2bd48cfaf93a0a8a/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:27 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d.
Oct 08 19:02:27 compute-0 podman[131772]: 2025-10-08 19:02:27.673460189 +0000 UTC m=+0.137762107 container init 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:02:27 compute-0 podman_exporter[131788]: ts=2025-10-08T19:02:27.699Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct 08 19:02:27 compute-0 podman_exporter[131788]: ts=2025-10-08T19:02:27.699Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct 08 19:02:27 compute-0 podman_exporter[131788]: ts=2025-10-08T19:02:27.699Z caller=handler.go:94 level=info msg="enabled collectors"
Oct 08 19:02:27 compute-0 podman_exporter[131788]: ts=2025-10-08T19:02:27.699Z caller=handler.go:105 level=info collector=container
Oct 08 19:02:27 compute-0 podman[131772]: 2025-10-08 19:02:27.712120336 +0000 UTC m=+0.176422184 container start 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:02:27 compute-0 podman[131772]: podman_exporter
Oct 08 19:02:27 compute-0 systemd[1]: Starting Podman API Service...
Oct 08 19:02:27 compute-0 systemd[1]: Started podman_exporter container.
Oct 08 19:02:27 compute-0 systemd[1]: Started Podman API Service.
Oct 08 19:02:27 compute-0 sudo[131730]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:27 compute-0 podman[131799]: time="2025-10-08T19:02:27Z" level=info msg="/usr/bin/podman filtering at log level info"
Oct 08 19:02:27 compute-0 podman[131799]: time="2025-10-08T19:02:27Z" level=info msg="Setting parallel job count to 25"
Oct 08 19:02:27 compute-0 podman[131799]: time="2025-10-08T19:02:27Z" level=info msg="Using sqlite as database backend"
Oct 08 19:02:27 compute-0 podman[131799]: time="2025-10-08T19:02:27Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Oct 08 19:02:27 compute-0 podman[131799]: time="2025-10-08T19:02:27Z" level=info msg="Using systemd socket activation to determine API endpoint"
Oct 08 19:02:27 compute-0 podman[131799]: time="2025-10-08T19:02:27Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Oct 08 19:02:27 compute-0 podman[131799]: @ - - [08/Oct/2025:19:02:27 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct 08 19:02:27 compute-0 podman[131799]: time="2025-10-08T19:02:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 08 19:02:27 compute-0 podman[131798]: 2025-10-08 19:02:27.820941943 +0000 UTC m=+0.095170177 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 08 19:02:27 compute-0 systemd[1]: 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d-8d3396b962ac6b3.service: Main process exited, code=exited, status=1/FAILURE
Oct 08 19:02:27 compute-0 systemd[1]: 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d-8d3396b962ac6b3.service: Failed with result 'exit-code'.
Oct 08 19:02:27 compute-0 podman[131799]: @ - - [08/Oct/2025:19:02:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 23086 "" "Go-http-client/1.1"
Oct 08 19:02:27 compute-0 podman_exporter[131788]: ts=2025-10-08T19:02:27.845Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct 08 19:02:27 compute-0 podman_exporter[131788]: ts=2025-10-08T19:02:27.846Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct 08 19:02:27 compute-0 podman_exporter[131788]: ts=2025-10-08T19:02:27.846Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct 08 19:02:28 compute-0 sudo[131983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpdmhurracmfwidiuvhafxbfklxrcxpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950147.9686608-740-113661883411614/AnsiballZ_systemd.py'
Oct 08 19:02:28 compute-0 sudo[131983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:28 compute-0 python3.9[131985]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 08 19:02:28 compute-0 systemd[1]: Stopping podman_exporter container...
Oct 08 19:02:28 compute-0 podman[131799]: @ - - [08/Oct/2025:19:02:27 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1641 "" "Go-http-client/1.1"
Oct 08 19:02:28 compute-0 systemd[1]: libpod-9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d.scope: Deactivated successfully.
Oct 08 19:02:28 compute-0 podman[131989]: 2025-10-08 19:02:28.806586941 +0000 UTC m=+0.069544813 container died 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:02:28 compute-0 systemd[1]: 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d-8d3396b962ac6b3.timer: Deactivated successfully.
Oct 08 19:02:28 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d.
Oct 08 19:02:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d-userdata-shm.mount: Deactivated successfully.
Oct 08 19:02:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-a326ecbc4abca4f807e2a12af1256dd5205b8dffe1da0f2d2bd48cfaf93a0a8a-merged.mount: Deactivated successfully.
Oct 08 19:02:29 compute-0 podman[131989]: 2025-10-08 19:02:29.141480422 +0000 UTC m=+0.404438294 container cleanup 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 08 19:02:29 compute-0 podman[131989]: podman_exporter
Oct 08 19:02:29 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct 08 19:02:29 compute-0 podman[132022]: podman_exporter
Oct 08 19:02:29 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Oct 08 19:02:29 compute-0 systemd[1]: Stopped podman_exporter container.
Oct 08 19:02:29 compute-0 systemd[1]: Starting podman_exporter container...
Oct 08 19:02:29 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:02:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a326ecbc4abca4f807e2a12af1256dd5205b8dffe1da0f2d2bd48cfaf93a0a8a/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a326ecbc4abca4f807e2a12af1256dd5205b8dffe1da0f2d2bd48cfaf93a0a8a/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:29 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d.
Oct 08 19:02:29 compute-0 podman[132035]: 2025-10-08 19:02:29.499506337 +0000 UTC m=+0.232656275 container init 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 08 19:02:29 compute-0 podman_exporter[132050]: ts=2025-10-08T19:02:29.522Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct 08 19:02:29 compute-0 podman_exporter[132050]: ts=2025-10-08T19:02:29.522Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct 08 19:02:29 compute-0 podman_exporter[132050]: ts=2025-10-08T19:02:29.522Z caller=handler.go:94 level=info msg="enabled collectors"
Oct 08 19:02:29 compute-0 podman_exporter[132050]: ts=2025-10-08T19:02:29.522Z caller=handler.go:105 level=info collector=container
Oct 08 19:02:29 compute-0 podman[131799]: @ - - [08/Oct/2025:19:02:29 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct 08 19:02:29 compute-0 podman[131799]: time="2025-10-08T19:02:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 08 19:02:29 compute-0 podman[132035]: 2025-10-08 19:02:29.54815699 +0000 UTC m=+0.281306868 container start 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 08 19:02:29 compute-0 podman[132035]: podman_exporter
Oct 08 19:02:29 compute-0 systemd[1]: Started podman_exporter container.
Oct 08 19:02:29 compute-0 sudo[131983]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:29 compute-0 podman[131799]: @ - - [08/Oct/2025:19:02:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 23088 "" "Go-http-client/1.1"
Oct 08 19:02:29 compute-0 podman_exporter[132050]: ts=2025-10-08T19:02:29.648Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct 08 19:02:29 compute-0 podman_exporter[132050]: ts=2025-10-08T19:02:29.649Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct 08 19:02:29 compute-0 podman_exporter[132050]: ts=2025-10-08T19:02:29.650Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct 08 19:02:29 compute-0 podman[132060]: 2025-10-08 19:02:29.65326067 +0000 UTC m=+0.095450535 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 08 19:02:29 compute-0 systemd[1]: 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d-7d9abd0262b7ee7b.service: Main process exited, code=exited, status=1/FAILURE
Oct 08 19:02:29 compute-0 systemd[1]: 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d-7d9abd0262b7ee7b.service: Failed with result 'exit-code'.
Oct 08 19:02:30 compute-0 sudo[132231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paoytfjwcbhrgzhpsjtfinclwqdesmsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950149.851642-748-83928741393237/AnsiballZ_stat.py'
Oct 08 19:02:30 compute-0 sudo[132231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:30 compute-0 python3.9[132233]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:02:30 compute-0 sudo[132231]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:30 compute-0 sudo[132354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgbrgwsqxritbpylmsciwpkuwpjlidpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950149.851642-748-83928741393237/AnsiballZ_copy.py'
Oct 08 19:02:30 compute-0 sudo[132354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:31 compute-0 python3.9[132356]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950149.851642-748-83928741393237/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 08 19:02:31 compute-0 sudo[132354]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:31 compute-0 sudo[132506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npczdpwyhqrfwlrxdrvojieogcqxcxcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950151.4787323-765-17915508087090/AnsiballZ_container_config_data.py'
Oct 08 19:02:31 compute-0 sudo[132506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:32 compute-0 python3.9[132508]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Oct 08 19:02:32 compute-0 sudo[132506]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:32 compute-0 sudo[132658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgvvhxksrvwmqbazhlvohcwthsiyzkiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950152.2441425-774-89000574783195/AnsiballZ_container_config_hash.py'
Oct 08 19:02:32 compute-0 sudo[132658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:32 compute-0 python3.9[132660]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 08 19:02:32 compute-0 sudo[132658]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:33 compute-0 sudo[132810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixysjajolmtauldzqwaklmtcidcatfci ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759950153.136388-784-17602253310261/AnsiballZ_edpm_container_manage.py'
Oct 08 19:02:33 compute-0 sudo[132810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:33 compute-0 python3[132812]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct 08 19:02:36 compute-0 podman[132825]: 2025-10-08 19:02:36.353532347 +0000 UTC m=+2.521581659 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Oct 08 19:02:36 compute-0 podman[132923]: 2025-10-08 19:02:36.569637986 +0000 UTC m=+0.116159608 container create 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, version=9.6, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350)
Oct 08 19:02:36 compute-0 podman[132923]: 2025-10-08 19:02:36.487449672 +0000 UTC m=+0.033971334 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Oct 08 19:02:36 compute-0 python3[132812]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Oct 08 19:02:36 compute-0 sudo[132810]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:37 compute-0 sudo[133125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftohjjvlxvfwdavmflwlpltvqerqbyhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950156.9724383-792-99747887701250/AnsiballZ_stat.py'
Oct 08 19:02:37 compute-0 sudo[133125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:37 compute-0 podman[133085]: 2025-10-08 19:02:37.353323491 +0000 UTC m=+0.083177403 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct 08 19:02:37 compute-0 systemd[1]: e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f-39eec91179f1428e.service: Main process exited, code=exited, status=1/FAILURE
Oct 08 19:02:37 compute-0 systemd[1]: e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f-39eec91179f1428e.service: Failed with result 'exit-code'.
Oct 08 19:02:37 compute-0 python3.9[133133]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 19:02:37 compute-0 sudo[133125]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:38 compute-0 sudo[133285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvgqjacswfesoveddwbkxnufxcmbcmcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950157.8340576-801-196761163274437/AnsiballZ_file.py'
Oct 08 19:02:38 compute-0 sudo[133285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:38 compute-0 python3.9[133287]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:02:38 compute-0 sudo[133285]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:38 compute-0 sudo[133436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rucagdwmsxgqumzhrunqjquevffybvun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950158.44593-801-150962681282531/AnsiballZ_copy.py'
Oct 08 19:02:38 compute-0 sudo[133436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:39 compute-0 python3.9[133438]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759950158.44593-801-150962681282531/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:02:39 compute-0 sudo[133436]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:39 compute-0 sudo[133512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdznhxgtwulvmilgmwbcnhuobexsezpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950158.44593-801-150962681282531/AnsiballZ_systemd.py'
Oct 08 19:02:39 compute-0 sudo[133512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:39 compute-0 python3.9[133514]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 08 19:02:39 compute-0 systemd[1]: Reloading.
Oct 08 19:02:39 compute-0 systemd-sysv-generator[133546]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 19:02:39 compute-0 systemd-rc-local-generator[133542]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 19:02:40 compute-0 sudo[133512]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:40 compute-0 sudo[133623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvnkdzjfcbxlhvcqzrimvuzfztifrxad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950158.44593-801-150962681282531/AnsiballZ_systemd.py'
Oct 08 19:02:40 compute-0 sudo[133623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:40 compute-0 python3.9[133625]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 08 19:02:40 compute-0 systemd[1]: Reloading.
Oct 08 19:02:40 compute-0 systemd-rc-local-generator[133653]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 08 19:02:40 compute-0 systemd-sysv-generator[133657]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 08 19:02:41 compute-0 systemd[1]: Starting openstack_network_exporter container...
Oct 08 19:02:41 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:02:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60cc909f2f382fd0d56d9aa7171681d4a4fdb2c9939442427688045fe76af904/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60cc909f2f382fd0d56d9aa7171681d4a4fdb2c9939442427688045fe76af904/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60cc909f2f382fd0d56d9aa7171681d4a4fdb2c9939442427688045fe76af904/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:41 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2.
Oct 08 19:02:41 compute-0 podman[133664]: 2025-10-08 19:02:41.347015611 +0000 UTC m=+0.158865931 container init 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350)
Oct 08 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *bridge.Collector
Oct 08 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *coverage.Collector
Oct 08 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *datapath.Collector
Oct 08 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *iface.Collector
Oct 08 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *memory.Collector
Oct 08 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *ovnnorthd.Collector
Oct 08 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *ovn.Collector
Oct 08 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *ovsdbserver.Collector
Oct 08 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *pmd_perf.Collector
Oct 08 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *pmd_rxq.Collector
Oct 08 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *vswitch.Collector
Oct 08 19:02:41 compute-0 openstack_network_exporter[133679]: NOTICE  19:02:41 main.go:76: listening on https://:9105/metrics
Oct 08 19:02:41 compute-0 podman[133664]: 2025-10-08 19:02:41.378071531 +0000 UTC m=+0.189921841 container start 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public)
Oct 08 19:02:41 compute-0 podman[133664]: openstack_network_exporter
Oct 08 19:02:41 compute-0 systemd[1]: Started openstack_network_exporter container.
Oct 08 19:02:41 compute-0 sudo[133623]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:41 compute-0 podman[133689]: 2025-10-08 19:02:41.5030291 +0000 UTC m=+0.107759178 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, name=ubi9-minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public)
Oct 08 19:02:42 compute-0 sudo[133862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-booipejpefqrftktaeydagugwplmfjrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950161.655377-825-172968390990995/AnsiballZ_systemd.py'
Oct 08 19:02:42 compute-0 sudo[133862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:42 compute-0 python3.9[133864]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 08 19:02:42 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Oct 08 19:02:42 compute-0 systemd[1]: libpod-58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2.scope: Deactivated successfully.
Oct 08 19:02:42 compute-0 podman[133868]: 2025-10-08 19:02:42.462501279 +0000 UTC m=+0.059975859 container died 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 08 19:02:42 compute-0 systemd[1]: 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2-71d9224341fc07c4.timer: Deactivated successfully.
Oct 08 19:02:42 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2.
Oct 08 19:02:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2-userdata-shm.mount: Deactivated successfully.
Oct 08 19:02:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-60cc909f2f382fd0d56d9aa7171681d4a4fdb2c9939442427688045fe76af904-merged.mount: Deactivated successfully.
Oct 08 19:02:42 compute-0 podman[133868]: 2025-10-08 19:02:42.938209512 +0000 UTC m=+0.535684112 container cleanup 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 08 19:02:42 compute-0 podman[133868]: openstack_network_exporter
Oct 08 19:02:42 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct 08 19:02:43 compute-0 podman[133897]: openstack_network_exporter
Oct 08 19:02:43 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Oct 08 19:02:43 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Oct 08 19:02:43 compute-0 systemd[1]: Starting openstack_network_exporter container...
Oct 08 19:02:43 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:02:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60cc909f2f382fd0d56d9aa7171681d4a4fdb2c9939442427688045fe76af904/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60cc909f2f382fd0d56d9aa7171681d4a4fdb2c9939442427688045fe76af904/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60cc909f2f382fd0d56d9aa7171681d4a4fdb2c9939442427688045fe76af904/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 08 19:02:43 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2.
Oct 08 19:02:43 compute-0 podman[133910]: 2025-10-08 19:02:43.197745155 +0000 UTC m=+0.141550175 container init 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, maintainer=Red Hat, Inc.)
Oct 08 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *bridge.Collector
Oct 08 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *coverage.Collector
Oct 08 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *datapath.Collector
Oct 08 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *iface.Collector
Oct 08 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *memory.Collector
Oct 08 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *ovnnorthd.Collector
Oct 08 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *ovn.Collector
Oct 08 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *ovsdbserver.Collector
Oct 08 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *pmd_perf.Collector
Oct 08 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *pmd_rxq.Collector
Oct 08 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *vswitch.Collector
Oct 08 19:02:43 compute-0 openstack_network_exporter[133927]: NOTICE  19:02:43 main.go:76: listening on https://:9105/metrics
Oct 08 19:02:43 compute-0 podman[133910]: 2025-10-08 19:02:43.228464315 +0000 UTC m=+0.172269335 container start 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, distribution-scope=public, architecture=x86_64, managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Oct 08 19:02:43 compute-0 podman[133910]: openstack_network_exporter
Oct 08 19:02:43 compute-0 systemd[1]: Started openstack_network_exporter container.
Oct 08 19:02:43 compute-0 sudo[133862]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:43 compute-0 podman[133937]: 2025-10-08 19:02:43.31768502 +0000 UTC m=+0.077053467 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public)
Oct 08 19:02:43 compute-0 sudo[134117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpxlffsmdgpzricqqgdosfqlykadksyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950163.4969826-833-27247245270432/AnsiballZ_find.py'
Oct 08 19:02:43 compute-0 sudo[134117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:43 compute-0 podman[134081]: 2025-10-08 19:02:43.846125435 +0000 UTC m=+0.075084981 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 08 19:02:44 compute-0 python3.9[134127]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 08 19:02:44 compute-0 sudo[134117]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:02:44.221 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:02:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:02:44.222 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:02:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:02:44.222 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:02:44 compute-0 sudo[134277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajzfziubnspgewkrtabdvlbuqellyods ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950164.3690712-843-104646797815159/AnsiballZ_podman_container_info.py'
Oct 08 19:02:44 compute-0 sudo[134277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:45 compute-0 python3.9[134279]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Oct 08 19:02:45 compute-0 sudo[134277]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:45 compute-0 sudo[134442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiqggqncojwcjbvmanmdpkwxmmbgxlpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950165.424152-851-125596121734850/AnsiballZ_podman_container_exec.py'
Oct 08 19:02:45 compute-0 sudo[134442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:46 compute-0 python3.9[134444]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 08 19:02:46 compute-0 systemd[1]: Started libpod-conmon-4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59.scope.
Oct 08 19:02:46 compute-0 podman[134445]: 2025-10-08 19:02:46.300253762 +0000 UTC m=+0.111439612 container exec 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 08 19:02:46 compute-0 podman[134445]: 2025-10-08 19:02:46.307711906 +0000 UTC m=+0.118897716 container exec_died 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 08 19:02:46 compute-0 sudo[134442]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:46 compute-0 systemd[1]: libpod-conmon-4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59.scope: Deactivated successfully.
Oct 08 19:02:46 compute-0 sudo[134627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqxvyemskniljzlvrydjrkisjeshnyma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950166.551589-859-119820981696595/AnsiballZ_podman_container_exec.py'
Oct 08 19:02:46 compute-0 sudo[134627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:47 compute-0 python3.9[134629]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 08 19:02:47 compute-0 systemd[1]: Started libpod-conmon-4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59.scope.
Oct 08 19:02:47 compute-0 podman[134630]: 2025-10-08 19:02:47.23768518 +0000 UTC m=+0.104497144 container exec 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:02:47 compute-0 podman[134630]: 2025-10-08 19:02:47.271591741 +0000 UTC m=+0.138403715 container exec_died 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 08 19:02:47 compute-0 systemd[1]: libpod-conmon-4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59.scope: Deactivated successfully.
Oct 08 19:02:47 compute-0 sudo[134627]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:47 compute-0 podman[134711]: 2025-10-08 19:02:47.681165221 +0000 UTC m=+0.084671566 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:02:47 compute-0 sudo[134837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aajaehqizvzfnpmqhnjopyhzqiwxmtfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950167.5292187-867-233493938731041/AnsiballZ_file.py'
Oct 08 19:02:47 compute-0 sudo[134837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:48 compute-0 python3.9[134839]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:02:48 compute-0 sudo[134837]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:48 compute-0 sudo[134989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umkfncrvcuusuindqkgewazgtgfpzcmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950168.3963947-876-68246985100130/AnsiballZ_podman_container_info.py'
Oct 08 19:02:48 compute-0 sudo[134989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:48 compute-0 python3.9[134991]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Oct 08 19:02:49 compute-0 sudo[134989]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:49 compute-0 podman[135127]: 2025-10-08 19:02:49.645056098 +0000 UTC m=+0.064473608 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct 08 19:02:49 compute-0 sudo[135174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfhvaxwcummdawajekkzwvicdywrhhse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950169.31217-884-279167368310218/AnsiballZ_podman_container_exec.py'
Oct 08 19:02:49 compute-0 sudo[135174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:49 compute-0 python3.9[135176]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 08 19:02:49 compute-0 systemd[1]: Started libpod-conmon-80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a.scope.
Oct 08 19:02:50 compute-0 podman[135177]: 2025-10-08 19:02:50.006089198 +0000 UTC m=+0.097501244 container exec 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 08 19:02:50 compute-0 podman[135177]: 2025-10-08 19:02:50.044303832 +0000 UTC m=+0.135715828 container exec_died 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct 08 19:02:50 compute-0 sudo[135174]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:50 compute-0 systemd[1]: libpod-conmon-80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a.scope: Deactivated successfully.
Oct 08 19:02:50 compute-0 podman[135330]: 2025-10-08 19:02:50.625725344 +0000 UTC m=+0.075392371 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 08 19:02:50 compute-0 sudo[135373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zveicxrxdrlnrugnllbbdknwinlvffvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950170.2590213-892-222447688878903/AnsiballZ_podman_container_exec.py'
Oct 08 19:02:50 compute-0 sudo[135373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:50 compute-0 python3.9[135377]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 08 19:02:50 compute-0 systemd[1]: Started libpod-conmon-80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a.scope.
Oct 08 19:02:50 compute-0 podman[135378]: 2025-10-08 19:02:50.986305311 +0000 UTC m=+0.102194348 container exec 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct 08 19:02:50 compute-0 podman[135378]: 2025-10-08 19:02:50.996243955 +0000 UTC m=+0.112132972 container exec_died 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 08 19:02:51 compute-0 sudo[135373]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:51 compute-0 systemd[1]: libpod-conmon-80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a.scope: Deactivated successfully.
Oct 08 19:02:51 compute-0 sudo[135575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmqgbohkkyegnqaqtmiiyoueqkzytpcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950171.2581954-900-215299283873482/AnsiballZ_file.py'
Oct 08 19:02:51 compute-0 sudo[135575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:51 compute-0 podman[135533]: 2025-10-08 19:02:51.728463786 +0000 UTC m=+0.126473343 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct 08 19:02:51 compute-0 python3.9[135583]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:02:51 compute-0 sudo[135575]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:52 compute-0 sudo[135738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yirxwhedngeeinehikkiwlhlrzaldqem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950172.1846554-909-264373872116387/AnsiballZ_podman_container_info.py'
Oct 08 19:02:52 compute-0 sudo[135738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:52 compute-0 python3.9[135740]: ansible-containers.podman.podman_container_info Invoked with name=['iscsid'] executable=podman
Oct 08 19:02:52 compute-0 sudo[135738]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:53 compute-0 sudo[135904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ievsjjlaijajkusbydddtnwiuekkhkos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950173.0837286-917-44382534425286/AnsiballZ_podman_container_exec.py'
Oct 08 19:02:53 compute-0 sudo[135904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:53 compute-0 python3.9[135906]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 08 19:02:54 compute-0 systemd[1]: Started libpod-conmon-3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845.scope.
Oct 08 19:02:54 compute-0 podman[135907]: 2025-10-08 19:02:54.044731735 +0000 UTC m=+0.295569947 container exec 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=iscsid)
Oct 08 19:02:54 compute-0 podman[135907]: 2025-10-08 19:02:54.081313722 +0000 UTC m=+0.332151884 container exec_died 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3)
Oct 08 19:02:54 compute-0 systemd[1]: libpod-conmon-3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845.scope: Deactivated successfully.
Oct 08 19:02:54 compute-0 sudo[135904]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:54 compute-0 sudo[136091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmsxfkemvutwlylcymsmwxkoyvdvaqys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950174.3467748-925-129097772412154/AnsiballZ_podman_container_exec.py'
Oct 08 19:02:54 compute-0 sudo[136091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:54 compute-0 python3.9[136093]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 08 19:02:55 compute-0 systemd[1]: Started libpod-conmon-3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845.scope.
Oct 08 19:02:55 compute-0 podman[136094]: 2025-10-08 19:02:55.064764849 +0000 UTC m=+0.078507210 container exec 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct 08 19:02:55 compute-0 podman[136094]: 2025-10-08 19:02:55.097328671 +0000 UTC m=+0.111071032 container exec_died 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 08 19:02:55 compute-0 systemd[1]: libpod-conmon-3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845.scope: Deactivated successfully.
Oct 08 19:02:55 compute-0 sudo[136091]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:55 compute-0 sudo[136273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmjzxzknwtcicxchzfwrnnpeodvkhbvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950175.360328-933-245639038181634/AnsiballZ_file.py'
Oct 08 19:02:55 compute-0 sudo[136273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:55 compute-0 python3.9[136275]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/iscsid recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:02:56 compute-0 sudo[136273]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:56 compute-0 sudo[136425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgwwcazrdhmnaedqibfrgyfdcfsfzmni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950176.258861-942-47211054175408/AnsiballZ_podman_container_info.py'
Oct 08 19:02:56 compute-0 sudo[136425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:56 compute-0 python3.9[136427]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Oct 08 19:02:56 compute-0 sudo[136425]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:57 compute-0 sudo[136591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chkdfxvawtfzfbhxavlbovkmavfhsotv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950177.113771-950-271597893674791/AnsiballZ_podman_container_exec.py'
Oct 08 19:02:57 compute-0 sudo[136591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:57 compute-0 python3.9[136593]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 08 19:02:57 compute-0 systemd[1]: Started libpod-conmon-62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d.scope.
Oct 08 19:02:57 compute-0 podman[136594]: 2025-10-08 19:02:57.763029507 +0000 UTC m=+0.077833850 container exec 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct 08 19:02:57 compute-0 podman[136594]: 2025-10-08 19:02:57.794253721 +0000 UTC m=+0.109058094 container exec_died 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 08 19:02:57 compute-0 systemd[1]: libpod-conmon-62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d.scope: Deactivated successfully.
Oct 08 19:02:57 compute-0 sudo[136591]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:58 compute-0 sudo[136775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxnvwbuccnowxnlyfcqvqehbdgfcixcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950178.0818145-958-129460319891003/AnsiballZ_podman_container_exec.py'
Oct 08 19:02:58 compute-0 sudo[136775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:58 compute-0 python3.9[136777]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 08 19:02:58 compute-0 systemd[1]: Started libpod-conmon-62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d.scope.
Oct 08 19:02:58 compute-0 podman[136778]: 2025-10-08 19:02:58.78998571 +0000 UTC m=+0.096366361 container exec 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, org.label-schema.build-date=20251001)
Oct 08 19:02:58 compute-0 podman[136778]: 2025-10-08 19:02:58.823287503 +0000 UTC m=+0.129668104 container exec_died 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct 08 19:02:58 compute-0 systemd[1]: libpod-conmon-62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d.scope: Deactivated successfully.
Oct 08 19:02:58 compute-0 sudo[136775]: pam_unix(sudo:session): session closed for user root
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.104 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.129 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.129 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.131 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.131 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.132 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.172 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.173 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.174 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.175 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.417 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.419 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6293MB free_disk=73.4556770324707GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.419 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.419 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:02:59 compute-0 sudo[136962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhxewllixmmpctfcaqcpeioatzevlwyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950179.0482714-966-219614508527541/AnsiballZ_file.py'
Oct 08 19:02:59 compute-0 sudo[136962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.507 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.508 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.533 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.554 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.556 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.556 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:02:59 compute-0 python3.9[136964]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:02:59 compute-0 sudo[136962]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:00 compute-0 nova_compute[117514]: 2025-10-08 19:03:00.165 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:03:00 compute-0 sudo[137125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qflwysnnshvnhotplytqmzzojovprqbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950179.953272-975-111328046218677/AnsiballZ_podman_container_info.py'
Oct 08 19:03:00 compute-0 sudo[137125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:00 compute-0 podman[137088]: 2025-10-08 19:03:00.389460069 +0000 UTC m=+0.089615597 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 08 19:03:00 compute-0 python3.9[137136]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Oct 08 19:03:00 compute-0 sudo[137125]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:00 compute-0 nova_compute[117514]: 2025-10-08 19:03:00.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:03:00 compute-0 nova_compute[117514]: 2025-10-08 19:03:00.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:03:00 compute-0 nova_compute[117514]: 2025-10-08 19:03:00.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:03:00 compute-0 nova_compute[117514]: 2025-10-08 19:03:00.732 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 08 19:03:00 compute-0 nova_compute[117514]: 2025-10-08 19:03:00.733 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:03:00 compute-0 nova_compute[117514]: 2025-10-08 19:03:00.734 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:03:00 compute-0 nova_compute[117514]: 2025-10-08 19:03:00.734 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:03:01 compute-0 sudo[137304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llrjankwnigijuwntygzfxomqlqllsaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950180.9428806-983-134618142452106/AnsiballZ_podman_container_exec.py'
Oct 08 19:03:01 compute-0 sudo[137304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:01 compute-0 python3.9[137306]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 08 19:03:01 compute-0 systemd[1]: Started libpod-conmon-e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f.scope.
Oct 08 19:03:01 compute-0 podman[137307]: 2025-10-08 19:03:01.664927498 +0000 UTC m=+0.120427730 container exec e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:03:01 compute-0 podman[137307]: 2025-10-08 19:03:01.701362642 +0000 UTC m=+0.156862874 container exec_died e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible)
Oct 08 19:03:01 compute-0 systemd[1]: libpod-conmon-e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f.scope: Deactivated successfully.
Oct 08 19:03:01 compute-0 sudo[137304]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:02 compute-0 sudo[137488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhvrbzjuwhcwjzoelwzeojaxqnjtdeyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950181.9692593-991-139121466409919/AnsiballZ_podman_container_exec.py'
Oct 08 19:03:02 compute-0 sudo[137488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:02 compute-0 python3.9[137490]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 08 19:03:02 compute-0 systemd[1]: Started libpod-conmon-e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f.scope.
Oct 08 19:03:02 compute-0 podman[137491]: 2025-10-08 19:03:02.872543715 +0000 UTC m=+0.304102561 container exec e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct 08 19:03:02 compute-0 podman[137491]: 2025-10-08 19:03:02.906359203 +0000 UTC m=+0.337918039 container exec_died e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 08 19:03:02 compute-0 sudo[137488]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:02 compute-0 systemd[1]: libpod-conmon-e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f.scope: Deactivated successfully.
Oct 08 19:03:03 compute-0 sudo[137673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgfndiyqvwdxsqqysyougrmibsdlnbvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950183.1105871-999-94656933906852/AnsiballZ_file.py'
Oct 08 19:03:03 compute-0 sudo[137673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:03 compute-0 python3.9[137675]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:03:03 compute-0 sudo[137673]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:04 compute-0 sudo[137825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjqfsrvdabzuyabmlorbemzwgzfxpsps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950183.929221-1008-127705703419959/AnsiballZ_podman_container_info.py'
Oct 08 19:03:04 compute-0 sudo[137825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:04 compute-0 python3.9[137827]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Oct 08 19:03:04 compute-0 sudo[137825]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:05 compute-0 sudo[137990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxniirnckxjshuasgflmhesokeuykacj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950184.8186078-1016-269681120736319/AnsiballZ_podman_container_exec.py'
Oct 08 19:03:05 compute-0 sudo[137990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:05 compute-0 python3.9[137992]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 08 19:03:05 compute-0 systemd[1]: Started libpod-conmon-9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213.scope.
Oct 08 19:03:05 compute-0 podman[137993]: 2025-10-08 19:03:05.508119708 +0000 UTC m=+0.103874296 container exec 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 08 19:03:05 compute-0 podman[137993]: 2025-10-08 19:03:05.546404434 +0000 UTC m=+0.142159042 container exec_died 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 08 19:03:05 compute-0 systemd[1]: libpod-conmon-9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213.scope: Deactivated successfully.
Oct 08 19:03:05 compute-0 sudo[137990]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:06 compute-0 sudo[138174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmpjftcuawhpqxwqzposxkyotfalumcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950185.838184-1024-10300424653080/AnsiballZ_podman_container_exec.py'
Oct 08 19:03:06 compute-0 sudo[138174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:06 compute-0 python3.9[138176]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 08 19:03:06 compute-0 systemd[1]: Started libpod-conmon-9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213.scope.
Oct 08 19:03:06 compute-0 podman[138177]: 2025-10-08 19:03:06.452466614 +0000 UTC m=+0.104266537 container exec 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 08 19:03:06 compute-0 podman[138177]: 2025-10-08 19:03:06.490588736 +0000 UTC m=+0.142388619 container exec_died 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:03:06 compute-0 systemd[1]: libpod-conmon-9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213.scope: Deactivated successfully.
Oct 08 19:03:06 compute-0 sudo[138174]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:07 compute-0 sudo[138356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyljwyicbyydiphlgjhfokiwhhdoouua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950186.7318256-1032-25314923412268/AnsiballZ_file.py'
Oct 08 19:03:07 compute-0 sudo[138356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:07 compute-0 python3.9[138358]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:03:07 compute-0 sudo[138356]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:07 compute-0 podman[138395]: 2025-10-08 19:03:07.685728955 +0000 UTC m=+0.088184586 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute)
Oct 08 19:03:07 compute-0 sudo[138528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snryuxyltlwzjqoferjfvpzbedawofjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950187.5882583-1041-169473158384515/AnsiballZ_podman_container_info.py'
Oct 08 19:03:07 compute-0 sudo[138528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:08 compute-0 python3.9[138530]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Oct 08 19:03:08 compute-0 sudo[138528]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:08 compute-0 sudo[138694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anbtvajvtoypgdokryfaojwcztxeivob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950188.4782605-1049-197513860074345/AnsiballZ_podman_container_exec.py'
Oct 08 19:03:08 compute-0 sudo[138694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:09 compute-0 python3.9[138696]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 08 19:03:09 compute-0 systemd[1]: Started libpod-conmon-9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d.scope.
Oct 08 19:03:09 compute-0 podman[138697]: 2025-10-08 19:03:09.399585628 +0000 UTC m=+0.312782607 container exec 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 08 19:03:09 compute-0 podman[138697]: 2025-10-08 19:03:09.435216497 +0000 UTC m=+0.348413396 container exec_died 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 08 19:03:09 compute-0 systemd[1]: libpod-conmon-9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d.scope: Deactivated successfully.
Oct 08 19:03:09 compute-0 sudo[138694]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:10 compute-0 sudo[138878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrqlmcrclmazezamsniaoffkqptbntmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950189.7163389-1057-246544594011294/AnsiballZ_podman_container_exec.py'
Oct 08 19:03:10 compute-0 sudo[138878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:10 compute-0 python3.9[138880]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 08 19:03:10 compute-0 systemd[1]: Started libpod-conmon-9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d.scope.
Oct 08 19:03:10 compute-0 podman[138881]: 2025-10-08 19:03:10.435496636 +0000 UTC m=+0.106556318 container exec 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 08 19:03:10 compute-0 podman[138881]: 2025-10-08 19:03:10.473642277 +0000 UTC m=+0.144701869 container exec_died 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 08 19:03:10 compute-0 systemd[1]: libpod-conmon-9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d.scope: Deactivated successfully.
Oct 08 19:03:10 compute-0 sudo[138878]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:11 compute-0 sudo[139062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thjxtvdsdbdkvuvrgijdpfucmtolqfzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950190.7230892-1065-84460898162960/AnsiballZ_file.py'
Oct 08 19:03:11 compute-0 sudo[139062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:11 compute-0 python3.9[139064]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:03:11 compute-0 sudo[139062]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:11 compute-0 sudo[139214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qblfinfcwuwwtncqyqpvlyrbdmlrjunx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950191.5079303-1074-180059737301673/AnsiballZ_podman_container_info.py'
Oct 08 19:03:11 compute-0 sudo[139214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:12 compute-0 python3.9[139216]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Oct 08 19:03:12 compute-0 sudo[139214]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:12 compute-0 sudo[139379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhszrmbhoehwmczddsmpzxnclalvtozr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950192.3003855-1082-151391289056774/AnsiballZ_podman_container_exec.py'
Oct 08 19:03:12 compute-0 sudo[139379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:12 compute-0 python3.9[139381]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 08 19:03:12 compute-0 systemd[1]: Started libpod-conmon-58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2.scope.
Oct 08 19:03:12 compute-0 podman[139382]: 2025-10-08 19:03:12.957532087 +0000 UTC m=+0.079519295 container exec 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9-minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 08 19:03:12 compute-0 podman[139382]: 2025-10-08 19:03:12.987102833 +0000 UTC m=+0.109090021 container exec_died 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350, name=ubi9-minimal, config_id=edpm, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Oct 08 19:03:13 compute-0 systemd[1]: libpod-conmon-58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2.scope: Deactivated successfully.
Oct 08 19:03:13 compute-0 sudo[139379]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:13 compute-0 sudo[139573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaqoejchcpdxtlzriepqahghwirhipvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950193.2239218-1090-204069329243098/AnsiballZ_podman_container_exec.py'
Oct 08 19:03:13 compute-0 sudo[139573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:13 compute-0 podman[139537]: 2025-10-08 19:03:13.60305913 +0000 UTC m=+0.091145458 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350)
Oct 08 19:03:13 compute-0 python3.9[139582]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 08 19:03:13 compute-0 systemd[1]: Started libpod-conmon-58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2.scope.
Oct 08 19:03:13 compute-0 podman[139588]: 2025-10-08 19:03:13.907368253 +0000 UTC m=+0.098693563 container exec 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 08 19:03:13 compute-0 podman[139588]: 2025-10-08 19:03:13.940063789 +0000 UTC m=+0.131389049 container exec_died 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41)
Oct 08 19:03:13 compute-0 systemd[1]: libpod-conmon-58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2.scope: Deactivated successfully.
Oct 08 19:03:13 compute-0 podman[139604]: 2025-10-08 19:03:13.983640515 +0000 UTC m=+0.076088927 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 08 19:03:13 compute-0 sudo[139573]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:14 compute-0 sudo[139789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oumruenhkvlvkvfplnluwmxswlysqcfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950194.182351-1098-30245051990928/AnsiballZ_file.py'
Oct 08 19:03:14 compute-0 sudo[139789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:14 compute-0 python3.9[139791]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:03:14 compute-0 sudo[139789]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:15 compute-0 sudo[139941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdufwrkzvnclvesnaiuakufqlyzfttha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950195.0405002-1107-48676773233028/AnsiballZ_file.py'
Oct 08 19:03:15 compute-0 sudo[139941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:15 compute-0 python3.9[139943]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:03:15 compute-0 sudo[139941]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:16 compute-0 sudo[140093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmtrugwuibnhpsrmfixvmdmwbsgeawmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950195.7130418-1115-55922560681916/AnsiballZ_stat.py'
Oct 08 19:03:16 compute-0 sudo[140093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:16 compute-0 python3.9[140095]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:03:16 compute-0 sudo[140093]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:16 compute-0 sudo[140216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdakvtdkruztywoqzdbtqceukjpdbbvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950195.7130418-1115-55922560681916/AnsiballZ_copy.py'
Oct 08 19:03:16 compute-0 sudo[140216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:16 compute-0 python3.9[140218]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950195.7130418-1115-55922560681916/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:03:16 compute-0 sudo[140216]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:17 compute-0 sudo[140368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpwltjjgolufnhhfnbitdnvdbautjkjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950197.1870756-1131-63185505399593/AnsiballZ_file.py'
Oct 08 19:03:17 compute-0 sudo[140368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:17 compute-0 python3.9[140370]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:03:17 compute-0 sudo[140368]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:18 compute-0 sudo[140531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evqmfgdqgvzznvdusbhsivwblryzxltx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950197.9764593-1139-200833103547515/AnsiballZ_stat.py'
Oct 08 19:03:18 compute-0 sudo[140531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:18 compute-0 podman[140494]: 2025-10-08 19:03:18.411806433 +0000 UTC m=+0.077687373 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 08 19:03:18 compute-0 python3.9[140537]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:03:18 compute-0 sudo[140531]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:18 compute-0 sudo[140620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niyhjtiekyijuwmwzbmvmcrfaancvidw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950197.9764593-1139-200833103547515/AnsiballZ_file.py'
Oct 08 19:03:18 compute-0 sudo[140620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:19 compute-0 python3.9[140622]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:03:19 compute-0 sudo[140620]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:19 compute-0 sudo[140772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahrdyhxcmavuaiuipiggdyugiukmqsjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950199.2814946-1151-86108736532553/AnsiballZ_stat.py'
Oct 08 19:03:19 compute-0 sudo[140772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:19 compute-0 python3.9[140774]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:03:19 compute-0 sudo[140772]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:20 compute-0 podman[140824]: 2025-10-08 19:03:20.113913764 +0000 UTC m=+0.060090990 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 08 19:03:20 compute-0 sudo[140866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsqtaxvpdfoztuxxzzlenmyylmssxfkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950199.2814946-1151-86108736532553/AnsiballZ_file.py'
Oct 08 19:03:20 compute-0 sudo[140866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:20 compute-0 python3.9[140870]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.1hdy_udt recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:03:20 compute-0 sudo[140866]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:20 compute-0 sudo[141031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyxcfurgkgdforwjzmmupmnrrjdtfcwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950200.525517-1163-52028660697223/AnsiballZ_stat.py'
Oct 08 19:03:20 compute-0 sudo[141031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:20 compute-0 podman[140994]: 2025-10-08 19:03:20.904836325 +0000 UTC m=+0.071880537 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent)
Oct 08 19:03:21 compute-0 python3.9[141035]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:03:21 compute-0 sudo[141031]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:21 compute-0 sudo[141115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhmmysroympnkaioglikockgowoebjyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950200.525517-1163-52028660697223/AnsiballZ_file.py'
Oct 08 19:03:21 compute-0 sudo[141115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:21 compute-0 python3.9[141117]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:03:21 compute-0 sudo[141115]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:22 compute-0 sudo[141277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgxqqzuenrrggigqwulgjntqzttasqai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950201.8644373-1176-9158827223355/AnsiballZ_command.py'
Oct 08 19:03:22 compute-0 sudo[141277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:22 compute-0 podman[141241]: 2025-10-08 19:03:22.223815208 +0000 UTC m=+0.119222200 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 08 19:03:22 compute-0 python3.9[141282]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 19:03:22 compute-0 sudo[141277]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:23 compute-0 sudo[141446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgivdqfmkaidsspmbbqinvksugriyizs ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759950202.602151-1184-203152030387499/AnsiballZ_edpm_nftables_from_files.py'
Oct 08 19:03:23 compute-0 sudo[141446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:23 compute-0 python3[141448]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 08 19:03:23 compute-0 sudo[141446]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:23 compute-0 sudo[141598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycebicgeymomuyllmjypkmssxerubcya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950203.4945056-1192-55355009194159/AnsiballZ_stat.py'
Oct 08 19:03:23 compute-0 sudo[141598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:24 compute-0 python3.9[141600]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:03:24 compute-0 sudo[141598]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:24 compute-0 sudo[141676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdomdcmlhtdbiveatqqrsttxcheqqbcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950203.4945056-1192-55355009194159/AnsiballZ_file.py'
Oct 08 19:03:24 compute-0 sudo[141676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:24 compute-0 python3.9[141678]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:03:24 compute-0 sudo[141676]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:25 compute-0 sudo[141828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwojfkoqfuqtwgwoqieoarlwrjqzuzow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950204.8200357-1204-265574327373303/AnsiballZ_stat.py'
Oct 08 19:03:25 compute-0 sudo[141828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:25 compute-0 python3.9[141830]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:03:25 compute-0 sudo[141828]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:25 compute-0 sudo[141906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdjuqzjmudnimtwwdefnfeoddwvscopc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950204.8200357-1204-265574327373303/AnsiballZ_file.py'
Oct 08 19:03:25 compute-0 sudo[141906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:25 compute-0 python3.9[141908]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:03:26 compute-0 sudo[141906]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:26 compute-0 sudo[142058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guxmkemddasoonukrdhlrxhziimtlsze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950206.188306-1216-168509479888279/AnsiballZ_stat.py'
Oct 08 19:03:26 compute-0 sudo[142058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:26 compute-0 python3.9[142060]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:03:26 compute-0 sudo[142058]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:27 compute-0 sudo[142136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buwelzjmypwwkhmvfnhxxuwpeghfqovp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950206.188306-1216-168509479888279/AnsiballZ_file.py'
Oct 08 19:03:27 compute-0 sudo[142136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:27 compute-0 python3.9[142138]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:03:27 compute-0 sudo[142136]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:27 compute-0 sudo[142288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ropghyfcgkoibugewurlbsxowdyynkrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950207.4185493-1228-38080849221153/AnsiballZ_stat.py'
Oct 08 19:03:27 compute-0 sudo[142288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:27 compute-0 python3.9[142290]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:03:28 compute-0 sudo[142288]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:28 compute-0 sudo[142366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsmpdeybdenprclknqrghbbonvtofpdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950207.4185493-1228-38080849221153/AnsiballZ_file.py'
Oct 08 19:03:28 compute-0 sudo[142366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:28 compute-0 python3.9[142368]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:03:28 compute-0 sudo[142366]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:28 compute-0 sudo[142518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loslgemvcifkvbutwqrhxqspruyvxoqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950208.58123-1240-253803914708402/AnsiballZ_stat.py'
Oct 08 19:03:28 compute-0 sudo[142518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:29 compute-0 python3.9[142520]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 08 19:03:29 compute-0 sudo[142518]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:29 compute-0 sudo[142643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdrblpijlkrmyrrwpmulxbreiflynfjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950208.58123-1240-253803914708402/AnsiballZ_copy.py'
Oct 08 19:03:29 compute-0 sudo[142643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:29 compute-0 python3.9[142645]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759950208.58123-1240-253803914708402/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:03:29 compute-0 sudo[142643]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:30 compute-0 sudo[142808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlulnoebrakvdgyrwejrfvsaipnrlfjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950210.1177697-1255-89326488013606/AnsiballZ_file.py'
Oct 08 19:03:30 compute-0 sudo[142808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:30 compute-0 podman[142769]: 2025-10-08 19:03:30.564199927 +0000 UTC m=+0.101591527 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 08 19:03:30 compute-0 python3.9[142821]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:03:30 compute-0 sudo[142808]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:31 compute-0 sudo[142971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvrxrwcuxmzhsnvrselqcgmaijlufhpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950210.9554288-1263-262687261036021/AnsiballZ_command.py'
Oct 08 19:03:31 compute-0 sudo[142971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:31 compute-0 python3.9[142973]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 19:03:31 compute-0 sudo[142971]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:32 compute-0 sudo[143126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuwergccbhbftzawectmzlbdgwebrntr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950211.7189996-1271-110022527466242/AnsiballZ_blockinfile.py'
Oct 08 19:03:32 compute-0 sudo[143126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:32 compute-0 python3.9[143128]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:03:32 compute-0 sudo[143126]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:33 compute-0 sudo[143278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aoqfxiocezabqgbsanjactwsmmfdfklt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950212.7201734-1280-214741079072615/AnsiballZ_command.py'
Oct 08 19:03:33 compute-0 sudo[143278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:33 compute-0 python3.9[143280]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 19:03:33 compute-0 sudo[143278]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:33 compute-0 sudo[143431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spiqofzdwoodydnldvdvuelyctekupuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950213.5098135-1288-23628574332082/AnsiballZ_stat.py'
Oct 08 19:03:33 compute-0 sudo[143431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:34 compute-0 python3.9[143433]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 08 19:03:34 compute-0 sudo[143431]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:34 compute-0 sudo[143585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqkardoaxpqnatwwaqysvhczqqzfeerw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950214.250453-1296-17469838936884/AnsiballZ_command.py'
Oct 08 19:03:34 compute-0 sudo[143585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:34 compute-0 python3.9[143587]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 08 19:03:34 compute-0 sudo[143585]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:35 compute-0 sudo[143740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdbgdlaspyuzllnykxpzdzrnltkysjvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759950215.0864043-1304-61243615240037/AnsiballZ_file.py'
Oct 08 19:03:35 compute-0 sudo[143740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:03:35 compute-0 python3.9[143742]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 08 19:03:35 compute-0 sudo[143740]: pam_unix(sudo:session): session closed for user root
Oct 08 19:03:36 compute-0 sshd-session[117862]: Connection closed by 192.168.122.30 port 58912
Oct 08 19:03:36 compute-0 sshd-session[117858]: pam_unix(sshd:session): session closed for user zuul
Oct 08 19:03:36 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Oct 08 19:03:36 compute-0 systemd[1]: session-11.scope: Consumed 1min 59.818s CPU time.
Oct 08 19:03:36 compute-0 systemd-logind[844]: Session 11 logged out. Waiting for processes to exit.
Oct 08 19:03:36 compute-0 systemd-logind[844]: Removed session 11.
Oct 08 19:03:38 compute-0 podman[143767]: 2025-10-08 19:03:38.691392199 +0000 UTC m=+0.104698566 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 08 19:03:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:03:44.223 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:03:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:03:44.223 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:03:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:03:44.223 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:03:44 compute-0 podman[143789]: 2025-10-08 19:03:44.637956794 +0000 UTC m=+0.056485817 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS)
Oct 08 19:03:44 compute-0 podman[143788]: 2025-10-08 19:03:44.653700364 +0000 UTC m=+0.068965494 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, distribution-scope=public, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350)
Oct 08 19:03:46 compute-0 systemd[1]: Stopping User Manager for UID 1000...
Oct 08 19:03:46 compute-0 systemd[1314]: Activating special unit Exit the Session...
Oct 08 19:03:46 compute-0 systemd[1314]: Removed slice User Background Tasks Slice.
Oct 08 19:03:46 compute-0 systemd[1314]: Stopped target Main User Target.
Oct 08 19:03:46 compute-0 systemd[1314]: Stopped target Basic System.
Oct 08 19:03:46 compute-0 systemd[1314]: Stopped target Paths.
Oct 08 19:03:46 compute-0 systemd[1314]: Stopped target Sockets.
Oct 08 19:03:46 compute-0 systemd[1314]: Stopped target Timers.
Oct 08 19:03:46 compute-0 systemd[1314]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct 08 19:03:46 compute-0 systemd[1314]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 08 19:03:46 compute-0 systemd[1314]: Closed D-Bus User Message Bus Socket.
Oct 08 19:03:46 compute-0 systemd[1314]: Stopped Create User's Volatile Files and Directories.
Oct 08 19:03:46 compute-0 systemd[1314]: Removed slice User Application Slice.
Oct 08 19:03:46 compute-0 systemd[1314]: Reached target Shutdown.
Oct 08 19:03:46 compute-0 systemd[1314]: Finished Exit the Session.
Oct 08 19:03:46 compute-0 systemd[1314]: Reached target Exit the Session.
Oct 08 19:03:46 compute-0 systemd[1]: user@1000.service: Deactivated successfully.
Oct 08 19:03:46 compute-0 systemd[1]: Stopped User Manager for UID 1000.
Oct 08 19:03:46 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/1000...
Oct 08 19:03:46 compute-0 systemd[1]: run-user-1000.mount: Deactivated successfully.
Oct 08 19:03:46 compute-0 systemd[1]: user-runtime-dir@1000.service: Deactivated successfully.
Oct 08 19:03:46 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/1000.
Oct 08 19:03:46 compute-0 systemd[1]: Removed slice User Slice of UID 1000.
Oct 08 19:03:46 compute-0 systemd[1]: user-1000.slice: Consumed 10min 35.549s CPU time.
Oct 08 19:03:48 compute-0 podman[143830]: 2025-10-08 19:03:48.667017896 +0000 UTC m=+0.085690072 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 08 19:03:49 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Oct 08 19:03:49 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct 08 19:03:49 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Oct 08 19:03:49 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Oct 08 19:03:50 compute-0 podman[143858]: 2025-10-08 19:03:50.701182374 +0000 UTC m=+0.113431187 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:03:51 compute-0 podman[143879]: 2025-10-08 19:03:51.639365306 +0000 UTC m=+0.061095409 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 08 19:03:52 compute-0 podman[143899]: 2025-10-08 19:03:52.719719676 +0000 UTC m=+0.135457015 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Oct 08 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.811 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.812 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.813 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.813 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.976 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.977 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6417MB free_disk=73.45592880249023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.977 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.977 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:04:00 compute-0 nova_compute[117514]: 2025-10-08 19:04:00.039 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:04:00 compute-0 nova_compute[117514]: 2025-10-08 19:04:00.039 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:04:00 compute-0 nova_compute[117514]: 2025-10-08 19:04:00.063 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:04:00 compute-0 nova_compute[117514]: 2025-10-08 19:04:00.080 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:04:00 compute-0 nova_compute[117514]: 2025-10-08 19:04:00.082 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:04:00 compute-0 nova_compute[117514]: 2025-10-08 19:04:00.082 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:04:01 compute-0 nova_compute[117514]: 2025-10-08 19:04:01.082 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:04:01 compute-0 podman[143927]: 2025-10-08 19:04:01.627640846 +0000 UTC m=+0.056508348 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 08 19:04:01 compute-0 nova_compute[117514]: 2025-10-08 19:04:01.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:04:02 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:04:02.201 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:04:02 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:04:02.202 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 08 19:04:02 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:04:02.203 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:04:02 compute-0 nova_compute[117514]: 2025-10-08 19:04:02.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:04:02 compute-0 nova_compute[117514]: 2025-10-08 19:04:02.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:04:02 compute-0 nova_compute[117514]: 2025-10-08 19:04:02.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:04:02 compute-0 nova_compute[117514]: 2025-10-08 19:04:02.738 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 08 19:04:02 compute-0 nova_compute[117514]: 2025-10-08 19:04:02.738 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:04:02 compute-0 nova_compute[117514]: 2025-10-08 19:04:02.739 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.240 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.241 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.241 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.241 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.241 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.241 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:04:09 compute-0 podman[143951]: 2025-10-08 19:04:09.672256146 +0000 UTC m=+0.084479987 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 08 19:04:15 compute-0 podman[143972]: 2025-10-08 19:04:15.682642518 +0000 UTC m=+0.081173974 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal)
Oct 08 19:04:15 compute-0 podman[143973]: 2025-10-08 19:04:15.71675427 +0000 UTC m=+0.111370127 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct 08 19:04:19 compute-0 podman[144011]: 2025-10-08 19:04:19.638416775 +0000 UTC m=+0.058580289 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 08 19:04:21 compute-0 podman[144035]: 2025-10-08 19:04:21.643920093 +0000 UTC m=+0.066956371 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid)
Oct 08 19:04:21 compute-0 podman[144055]: 2025-10-08 19:04:21.736928232 +0000 UTC m=+0.052568683 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 08 19:04:23 compute-0 podman[144077]: 2025-10-08 19:04:23.700680195 +0000 UTC m=+0.116906820 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Oct 08 19:04:32 compute-0 podman[144103]: 2025-10-08 19:04:32.661187107 +0000 UTC m=+0.066913850 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 08 19:04:34 compute-0 PackageKit[55288]: daemon quit
Oct 08 19:04:34 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Oct 08 19:04:40 compute-0 podman[144130]: 2025-10-08 19:04:40.659059148 +0000 UTC m=+0.073807329 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, io.buildah.version=1.41.3)
Oct 08 19:04:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:04:44.224 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:04:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:04:44.224 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:04:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:04:44.224 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:04:46 compute-0 podman[144151]: 2025-10-08 19:04:46.669633425 +0000 UTC m=+0.083505608 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Oct 08 19:04:46 compute-0 podman[144150]: 2025-10-08 19:04:46.670121728 +0000 UTC m=+0.086581122 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Oct 08 19:04:50 compute-0 podman[144193]: 2025-10-08 19:04:50.629452395 +0000 UTC m=+0.053912720 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 08 19:04:52 compute-0 podman[144218]: 2025-10-08 19:04:52.670543534 +0000 UTC m=+0.085021369 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:04:52 compute-0 podman[144219]: 2025-10-08 19:04:52.675299486 +0000 UTC m=+0.083375774 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:04:54 compute-0 podman[144255]: 2025-10-08 19:04:54.734844546 +0000 UTC m=+0.146546949 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 08 19:04:58 compute-0 sshd-session[144281]: Invalid user hadoop from 193.32.162.151 port 47762
Oct 08 19:04:58 compute-0 sshd-session[144281]: pam_unix(sshd:auth): check pass; user unknown
Oct 08 19:04:58 compute-0 sshd-session[144281]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.32.162.151
Oct 08 19:05:00 compute-0 nova_compute[117514]: 2025-10-08 19:05:00.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:05:00 compute-0 nova_compute[117514]: 2025-10-08 19:05:00.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:05:00 compute-0 nova_compute[117514]: 2025-10-08 19:05:00.742 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:05:00 compute-0 sshd-session[144281]: Failed password for invalid user hadoop from 193.32.162.151 port 47762 ssh2
Oct 08 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.745 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.745 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.745 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.745 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.958 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.959 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6430MB free_disk=73.45976257324219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.959 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.959 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:05:02 compute-0 nova_compute[117514]: 2025-10-08 19:05:02.025 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:05:02 compute-0 nova_compute[117514]: 2025-10-08 19:05:02.026 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:05:02 compute-0 nova_compute[117514]: 2025-10-08 19:05:02.044 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:05:02 compute-0 nova_compute[117514]: 2025-10-08 19:05:02.056 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:05:02 compute-0 nova_compute[117514]: 2025-10-08 19:05:02.058 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:05:02 compute-0 nova_compute[117514]: 2025-10-08 19:05:02.058 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:05:02 compute-0 sshd-session[144281]: Connection closed by invalid user hadoop 193.32.162.151 port 47762 [preauth]
Oct 08 19:05:03 compute-0 nova_compute[117514]: 2025-10-08 19:05:03.057 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:05:03 compute-0 nova_compute[117514]: 2025-10-08 19:05:03.058 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:05:03 compute-0 nova_compute[117514]: 2025-10-08 19:05:03.059 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:05:03 compute-0 nova_compute[117514]: 2025-10-08 19:05:03.074 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 08 19:05:03 compute-0 nova_compute[117514]: 2025-10-08 19:05:03.075 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:05:03 compute-0 podman[144283]: 2025-10-08 19:05:03.651724363 +0000 UTC m=+0.063449983 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:05:03 compute-0 nova_compute[117514]: 2025-10-08 19:05:03.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:05:11 compute-0 podman[144307]: 2025-10-08 19:05:11.677316539 +0000 UTC m=+0.087484758 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm)
Oct 08 19:05:17 compute-0 podman[144328]: 2025-10-08 19:05:17.690786845 +0000 UTC m=+0.088887747 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:05:17 compute-0 podman[144327]: 2025-10-08 19:05:17.70835938 +0000 UTC m=+0.115414619 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350)
Oct 08 19:05:21 compute-0 podman[144368]: 2025-10-08 19:05:21.666197128 +0000 UTC m=+0.078700313 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 08 19:05:23 compute-0 podman[144393]: 2025-10-08 19:05:23.678810035 +0000 UTC m=+0.087381226 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 08 19:05:23 compute-0 podman[144394]: 2025-10-08 19:05:23.694185125 +0000 UTC m=+0.097390596 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct 08 19:05:25 compute-0 podman[144430]: 2025-10-08 19:05:25.703162859 +0000 UTC m=+0.118436485 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:05:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:31.021 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:05:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:31.023 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 08 19:05:34 compute-0 podman[144458]: 2025-10-08 19:05:34.659265754 +0000 UTC m=+0.073058035 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 08 19:05:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:39.025 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:05:42 compute-0 podman[144482]: 2025-10-08 19:05:42.679555656 +0000 UTC m=+0.092844989 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct 08 19:05:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:44.226 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:05:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:44.226 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:05:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:44.226 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:05:48 compute-0 podman[144503]: 2025-10-08 19:05:48.676607835 +0000 UTC m=+0.090876773 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 08 19:05:48 compute-0 podman[144504]: 2025-10-08 19:05:48.68356069 +0000 UTC m=+0.086806930 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 08 19:05:50 compute-0 nova_compute[117514]: 2025-10-08 19:05:50.789 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "533c431a-8ae8-4310-81dc-29285b78f93c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:05:50 compute-0 nova_compute[117514]: 2025-10-08 19:05:50.789 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:05:50 compute-0 nova_compute[117514]: 2025-10-08 19:05:50.811 2 DEBUG nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 08 19:05:50 compute-0 nova_compute[117514]: 2025-10-08 19:05:50.929 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:05:50 compute-0 nova_compute[117514]: 2025-10-08 19:05:50.930 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:05:50 compute-0 nova_compute[117514]: 2025-10-08 19:05:50.939 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 08 19:05:50 compute-0 nova_compute[117514]: 2025-10-08 19:05:50.940 2 INFO nova.compute.claims [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Claim successful on node compute-0.ctlplane.example.com
Oct 08 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.041 2 DEBUG nova.compute.provider_tree [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.055 2 DEBUG nova.scheduler.client.report [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.080 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.081 2 DEBUG nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 08 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.125 2 DEBUG nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 08 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.126 2 DEBUG nova.network.neutron [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 08 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.162 2 INFO nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 08 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.185 2 DEBUG nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 08 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.271 2 DEBUG nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 08 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.273 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 08 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.274 2 INFO nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Creating image(s)
Oct 08 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.275 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.275 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.276 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.277 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.278 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:05:52 compute-0 nova_compute[117514]: 2025-10-08 19:05:52.074 2 WARNING oslo_policy.policy [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Oct 08 19:05:52 compute-0 nova_compute[117514]: 2025-10-08 19:05:52.074 2 WARNING oslo_policy.policy [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Oct 08 19:05:52 compute-0 nova_compute[117514]: 2025-10-08 19:05:52.079 2 DEBUG nova.policy [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 08 19:05:52 compute-0 podman[144545]: 2025-10-08 19:05:52.669992856 +0000 UTC m=+0.084613048 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:05:52 compute-0 nova_compute[117514]: 2025-10-08 19:05:52.683 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:05:52 compute-0 nova_compute[117514]: 2025-10-08 19:05:52.788 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df.part --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:05:52 compute-0 nova_compute[117514]: 2025-10-08 19:05:52.789 2 DEBUG nova.virt.images [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] 23cfa426-7011-4566-992d-1c7af39f70dd was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Oct 08 19:05:52 compute-0 nova_compute[117514]: 2025-10-08 19:05:52.792 2 DEBUG nova.privsep.utils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Oct 08 19:05:52 compute-0 nova_compute[117514]: 2025-10-08 19:05:52.793 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df.part /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.049 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df.part /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df.converted" returned: 0 in 0.257s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.054 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.151 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df.converted --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.153 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.874s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.166 2 INFO oslo.privsep.daemon [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpp8ow9d39/privsep.sock']
Oct 08 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.218 2 DEBUG nova.network.neutron [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Successfully created port: 82f4743a-dcdc-49f7-be61-94d565e29842 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 08 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.900 2 INFO oslo.privsep.daemon [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Spawned new privsep daemon via rootwrap
Oct 08 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.766 54 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 08 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.770 54 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 08 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.773 54 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Oct 08 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.773 54 INFO oslo.privsep.daemon [-] privsep daemon running as pid 54
Oct 08 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.994 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.047 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.048 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.049 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.060 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.116 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.117 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.152 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.153 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.153 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.210 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.211 2 DEBUG nova.virt.disk.api [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.212 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.273 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.275 2 DEBUG nova.virt.disk.api [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.275 2 DEBUG nova.objects.instance [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 533c431a-8ae8-4310-81dc-29285b78f93c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.291 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.291 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Ensure instance console log exists: /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.292 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.292 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.292 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:05:54 compute-0 podman[144606]: 2025-10-08 19:05:54.678196449 +0000 UTC m=+0.077215511 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct 08 19:05:54 compute-0 podman[144605]: 2025-10-08 19:05:54.694376142 +0000 UTC m=+0.104346851 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 08 19:05:55 compute-0 nova_compute[117514]: 2025-10-08 19:05:55.061 2 DEBUG nova.network.neutron [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Successfully updated port: 82f4743a-dcdc-49f7-be61-94d565e29842 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 08 19:05:55 compute-0 nova_compute[117514]: 2025-10-08 19:05:55.077 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:05:55 compute-0 nova_compute[117514]: 2025-10-08 19:05:55.077 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:05:55 compute-0 nova_compute[117514]: 2025-10-08 19:05:55.078 2 DEBUG nova.network.neutron [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 08 19:05:55 compute-0 nova_compute[117514]: 2025-10-08 19:05:55.240 2 DEBUG nova.network.neutron [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 08 19:05:55 compute-0 nova_compute[117514]: 2025-10-08 19:05:55.564 2 DEBUG nova.compute.manager [req-3474fb1f-ad2c-42e9-b2d3-dd50af38bc26 req-4424d32f-9f5d-4126-85c0-9cd46f5f1538 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received event network-changed-82f4743a-dcdc-49f7-be61-94d565e29842 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:05:55 compute-0 nova_compute[117514]: 2025-10-08 19:05:55.564 2 DEBUG nova.compute.manager [req-3474fb1f-ad2c-42e9-b2d3-dd50af38bc26 req-4424d32f-9f5d-4126-85c0-9cd46f5f1538 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Refreshing instance network info cache due to event network-changed-82f4743a-dcdc-49f7-be61-94d565e29842. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:05:55 compute-0 nova_compute[117514]: 2025-10-08 19:05:55.565 2 DEBUG oslo_concurrency.lockutils [req-3474fb1f-ad2c-42e9-b2d3-dd50af38bc26 req-4424d32f-9f5d-4126-85c0-9cd46f5f1538 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.018 2 DEBUG nova.network.neutron [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updating instance_info_cache with network_info: [{"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.051 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.052 2 DEBUG nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Instance network_info: |[{"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.052 2 DEBUG oslo_concurrency.lockutils [req-3474fb1f-ad2c-42e9-b2d3-dd50af38bc26 req-4424d32f-9f5d-4126-85c0-9cd46f5f1538 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.052 2 DEBUG nova.network.neutron [req-3474fb1f-ad2c-42e9-b2d3-dd50af38bc26 req-4424d32f-9f5d-4126-85c0-9cd46f5f1538 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Refreshing network info cache for port 82f4743a-dcdc-49f7-be61-94d565e29842 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.055 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Start _get_guest_xml network_info=[{"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.069 2 WARNING nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.075 2 DEBUG nova.virt.libvirt.host [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.076 2 DEBUG nova.virt.libvirt.host [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.079 2 DEBUG nova.virt.libvirt.host [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.079 2 DEBUG nova.virt.libvirt.host [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.080 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.080 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.081 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.081 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.081 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.081 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.082 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.082 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.082 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.082 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.083 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.083 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.126 2 DEBUG nova.privsep.utils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.128 2 DEBUG nova.virt.libvirt.vif [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-447228763',display_name='tempest-TestNetworkBasicOps-server-447228763',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-447228763',id=1,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEkUPXM3K1FQRSOHUI4ceK1l6cbpFonPXFALKMkZcGgnSoRiUTQsb/Q287ApBX2G3xb2VwfVQAcm0rggAGmL4bEoFJTCQrQCAGh+fp9j7aUYBxWFzZf4Ok3jDCvBVuh0yA==',key_name='tempest-TestNetworkBasicOps-1885837558',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-2r2x09q7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:05:51Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=533c431a-8ae8-4310-81dc-29285b78f93c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.129 2 DEBUG nova.network.os_vif_util [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.130 2 DEBUG nova.network.os_vif_util [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:6b:6c,bridge_name='br-int',has_traffic_filtering=True,id=82f4743a-dcdc-49f7-be61-94d565e29842,network=Network(a913b285-6d0a-478e-aa24-18bb458d8f7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82f4743a-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.133 2 DEBUG nova.objects.instance [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 533c431a-8ae8-4310-81dc-29285b78f93c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.151 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] End _get_guest_xml xml=<domain type="kvm">
Oct 08 19:05:56 compute-0 nova_compute[117514]:   <uuid>533c431a-8ae8-4310-81dc-29285b78f93c</uuid>
Oct 08 19:05:56 compute-0 nova_compute[117514]:   <name>instance-00000001</name>
Oct 08 19:05:56 compute-0 nova_compute[117514]:   <memory>131072</memory>
Oct 08 19:05:56 compute-0 nova_compute[117514]:   <vcpu>1</vcpu>
Oct 08 19:05:56 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <nova:name>tempest-TestNetworkBasicOps-server-447228763</nova:name>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <nova:creationTime>2025-10-08 19:05:56</nova:creationTime>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <nova:flavor name="m1.nano">
Oct 08 19:05:56 compute-0 nova_compute[117514]:         <nova:memory>128</nova:memory>
Oct 08 19:05:56 compute-0 nova_compute[117514]:         <nova:disk>1</nova:disk>
Oct 08 19:05:56 compute-0 nova_compute[117514]:         <nova:swap>0</nova:swap>
Oct 08 19:05:56 compute-0 nova_compute[117514]:         <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:05:56 compute-0 nova_compute[117514]:         <nova:vcpus>1</nova:vcpus>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       </nova:flavor>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <nova:owner>
Oct 08 19:05:56 compute-0 nova_compute[117514]:         <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:05:56 compute-0 nova_compute[117514]:         <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       </nova:owner>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <nova:ports>
Oct 08 19:05:56 compute-0 nova_compute[117514]:         <nova:port uuid="82f4743a-dcdc-49f7-be61-94d565e29842">
Oct 08 19:05:56 compute-0 nova_compute[117514]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:         </nova:port>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       </nova:ports>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     </nova:instance>
Oct 08 19:05:56 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:05:56 compute-0 nova_compute[117514]:   <sysinfo type="smbios">
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <system>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <entry name="manufacturer">RDO</entry>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <entry name="product">OpenStack Compute</entry>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <entry name="serial">533c431a-8ae8-4310-81dc-29285b78f93c</entry>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <entry name="uuid">533c431a-8ae8-4310-81dc-29285b78f93c</entry>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <entry name="family">Virtual Machine</entry>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     </system>
Oct 08 19:05:56 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:05:56 compute-0 nova_compute[117514]:   <os>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <boot dev="hd"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <smbios mode="sysinfo"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:   </os>
Oct 08 19:05:56 compute-0 nova_compute[117514]:   <features>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <vmcoreinfo/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:   </features>
Oct 08 19:05:56 compute-0 nova_compute[117514]:   <clock offset="utc">
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <timer name="pit" tickpolicy="delay"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <timer name="hpet" present="no"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:05:56 compute-0 nova_compute[117514]:   <cpu mode="host-model" match="exact">
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <topology sockets="1" cores="1" threads="1"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:05:56 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <disk type="file" device="disk">
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <target dev="vda" bus="virtio"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <disk type="file" device="cdrom">
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <driver name="qemu" type="raw" cache="none"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk.config"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <target dev="sda" bus="sata"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <interface type="ethernet">
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <mac address="fa:16:3e:2e:6b:6c"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <mtu size="1442"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <target dev="tap82f4743a-dc"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <serial type="pty">
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <log file="/var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/console.log" append="off"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <video>
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     </video>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <input type="tablet" bus="usb"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <rng model="virtio">
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <backend model="random">/dev/urandom</backend>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <controller type="usb" index="0"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     <memballoon model="virtio">
Oct 08 19:05:56 compute-0 nova_compute[117514]:       <stats period="10"/>
Oct 08 19:05:56 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:05:56 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:05:56 compute-0 nova_compute[117514]: </domain>
Oct 08 19:05:56 compute-0 nova_compute[117514]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.153 2 DEBUG nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Preparing to wait for external event network-vif-plugged-82f4743a-dcdc-49f7-be61-94d565e29842 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.154 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.154 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.155 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.156 2 DEBUG nova.virt.libvirt.vif [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-447228763',display_name='tempest-TestNetworkBasicOps-server-447228763',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-447228763',id=1,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEkUPXM3K1FQRSOHUI4ceK1l6cbpFonPXFALKMkZcGgnSoRiUTQsb/Q287ApBX2G3xb2VwfVQAcm0rggAGmL4bEoFJTCQrQCAGh+fp9j7aUYBxWFzZf4Ok3jDCvBVuh0yA==',key_name='tempest-TestNetworkBasicOps-1885837558',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-2r2x09q7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:05:51Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=533c431a-8ae8-4310-81dc-29285b78f93c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.157 2 DEBUG nova.network.os_vif_util [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.158 2 DEBUG nova.network.os_vif_util [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:6b:6c,bridge_name='br-int',has_traffic_filtering=True,id=82f4743a-dcdc-49f7-be61-94d565e29842,network=Network(a913b285-6d0a-478e-aa24-18bb458d8f7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82f4743a-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.158 2 DEBUG os_vif [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:6b:6c,bridge_name='br-int',has_traffic_filtering=True,id=82f4743a-dcdc-49f7-be61-94d565e29842,network=Network(a913b285-6d0a-478e-aa24-18bb458d8f7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82f4743a-dc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.298 2 DEBUG ovsdbapp.backend.ovs_idl [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.299 2 DEBUG ovsdbapp.backend.ovs_idl [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.299 2 DEBUG ovsdbapp.backend.ovs_idl [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.320 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.321 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.321 2 INFO oslo.privsep.daemon [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpctdqzjyn/privsep.sock']
Oct 08 19:05:56 compute-0 podman[144648]: 2025-10-08 19:05:56.691418102 +0000 UTC m=+0.113696422 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.043 2 INFO oslo.privsep.daemon [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Spawned new privsep daemon via rootwrap
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.907 75 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.911 75 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.914 75 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.914 75 INFO oslo.privsep.daemon [-] privsep daemon running as pid 75
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.247 2 DEBUG nova.network.neutron [req-3474fb1f-ad2c-42e9-b2d3-dd50af38bc26 req-4424d32f-9f5d-4126-85c0-9cd46f5f1538 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updated VIF entry in instance network info cache for port 82f4743a-dcdc-49f7-be61-94d565e29842. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.248 2 DEBUG nova.network.neutron [req-3474fb1f-ad2c-42e9-b2d3-dd50af38bc26 req-4424d32f-9f5d-4126-85c0-9cd46f5f1538 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updating instance_info_cache with network_info: [{"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.266 2 DEBUG oslo_concurrency.lockutils [req-3474fb1f-ad2c-42e9-b2d3-dd50af38bc26 req-4424d32f-9f5d-4126-85c0-9cd46f5f1538 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.363 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82f4743a-dc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.364 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap82f4743a-dc, col_values=(('external_ids', {'iface-id': '82f4743a-dcdc-49f7-be61-94d565e29842', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2e:6b:6c', 'vm-uuid': '533c431a-8ae8-4310-81dc-29285b78f93c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:05:57 compute-0 NetworkManager[1035]: <info>  [1759950357.3670] manager: (tap82f4743a-dc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.375 2 INFO os_vif [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:6b:6c,bridge_name='br-int',has_traffic_filtering=True,id=82f4743a-dcdc-49f7-be61-94d565e29842,network=Network(a913b285-6d0a-478e-aa24-18bb458d8f7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82f4743a-dc')
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.414 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.415 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.415 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:2e:6b:6c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.415 2 INFO nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Using config drive
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.833 2 INFO nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Creating config drive at /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk.config
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.842 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp69xewfxk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.993 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp69xewfxk" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:05:58 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Oct 08 19:05:58 compute-0 kernel: tap82f4743a-dc: entered promiscuous mode
Oct 08 19:05:58 compute-0 NetworkManager[1035]: <info>  [1759950358.0815] manager: (tap82f4743a-dc): new Tun device (/org/freedesktop/NetworkManager/Devices/22)
Oct 08 19:05:58 compute-0 ovn_controller[19759]: 2025-10-08T19:05:58Z|00027|binding|INFO|Claiming lport 82f4743a-dcdc-49f7-be61-94d565e29842 for this chassis.
Oct 08 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:05:58 compute-0 ovn_controller[19759]: 2025-10-08T19:05:58Z|00028|binding|INFO|82f4743a-dcdc-49f7-be61-94d565e29842: Claiming fa:16:3e:2e:6b:6c 10.100.0.3
Oct 08 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.098 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:6b:6c 10.100.0.3'], port_security=['fa:16:3e:2e:6b:6c 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a913b285-6d0a-478e-aa24-18bb458d8f7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd3706646-002b-4286-ab41-a86fd84e3356', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30f96b84-f723-4541-a1ae-463e873ff4a9, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=82f4743a-dcdc-49f7-be61-94d565e29842) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.100 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 82f4743a-dcdc-49f7-be61-94d565e29842 in datapath a913b285-6d0a-478e-aa24-18bb458d8f7a bound to our chassis
Oct 08 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.101 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a913b285-6d0a-478e-aa24-18bb458d8f7a
Oct 08 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.102 28643 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpwcgl_2bs/privsep.sock']
Oct 08 19:05:58 compute-0 systemd-udevd[144702]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:05:58 compute-0 NetworkManager[1035]: <info>  [1759950358.1261] device (tap82f4743a-dc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 08 19:05:58 compute-0 NetworkManager[1035]: <info>  [1759950358.1267] device (tap82f4743a-dc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 08 19:05:58 compute-0 systemd-machined[77568]: New machine qemu-1-instance-00000001.
Oct 08 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:05:58 compute-0 ovn_controller[19759]: 2025-10-08T19:05:58Z|00029|binding|INFO|Setting lport 82f4743a-dcdc-49f7-be61-94d565e29842 ovn-installed in OVS
Oct 08 19:05:58 compute-0 ovn_controller[19759]: 2025-10-08T19:05:58Z|00030|binding|INFO|Setting lport 82f4743a-dcdc-49f7-be61-94d565e29842 up in Southbound
Oct 08 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:05:58 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Oct 08 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.387 2 DEBUG nova.compute.manager [req-94b4723a-a5e7-4f69-96ff-b852dc67af3e req-c17741c4-e8d4-457a-b540-cb8317abaf1b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received event network-vif-plugged-82f4743a-dcdc-49f7-be61-94d565e29842 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.387 2 DEBUG oslo_concurrency.lockutils [req-94b4723a-a5e7-4f69-96ff-b852dc67af3e req-c17741c4-e8d4-457a-b540-cb8317abaf1b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.388 2 DEBUG oslo_concurrency.lockutils [req-94b4723a-a5e7-4f69-96ff-b852dc67af3e req-c17741c4-e8d4-457a-b540-cb8317abaf1b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.388 2 DEBUG oslo_concurrency.lockutils [req-94b4723a-a5e7-4f69-96ff-b852dc67af3e req-c17741c4-e8d4-457a-b540-cb8317abaf1b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.388 2 DEBUG nova.compute.manager [req-94b4723a-a5e7-4f69-96ff-b852dc67af3e req-c17741c4-e8d4-457a-b540-cb8317abaf1b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Processing event network-vif-plugged-82f4743a-dcdc-49f7-be61-94d565e29842 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 08 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 08 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.735 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 08 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.735 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.735 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 08 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.739 28643 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 08 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.740 28643 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpwcgl_2bs/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Oct 08 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.610 144726 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 08 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.617 144726 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 08 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.620 144726 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Oct 08 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.621 144726 INFO oslo.privsep.daemon [-] privsep daemon running as pid 144726
Oct 08 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.743 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[78f73a4d-f8c8-4ebb-b6ce-e1bdd1ee2abc]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.746 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.013 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950359.0127413, 533c431a-8ae8-4310-81dc-29285b78f93c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.013 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] VM Started (Lifecycle Event)
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.015 2 DEBUG nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.029 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.033 2 INFO nova.virt.libvirt.driver [-] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Instance spawned successfully.
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.033 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.097 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.102 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.129 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.129 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.130 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.131 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.132 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.133 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.140 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.140 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950359.0141673, 533c431a-8ae8-4310-81dc-29285b78f93c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.141 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] VM Paused (Lifecycle Event)
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.205 2 INFO nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Took 7.93 seconds to spawn the instance on the hypervisor.
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.206 2 DEBUG nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.220 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.223 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950359.0185692, 533c431a-8ae8-4310-81dc-29285b78f93c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.224 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] VM Resumed (Lifecycle Event)
Oct 08 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.232 144726 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.232 144726 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.232 144726 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.260 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.264 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.300 2 INFO nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Took 8.41 seconds to build instance.
Oct 08 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.323 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.534s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.812 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ddfb92a8-9f26-449d-ba87-3b1487f8ed33]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.813 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa913b285-61 in ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 08 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.816 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa913b285-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 08 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.816 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[166982ad-869b-4c00-af73-6c6ef338cb43]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.819 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ff2fa2c9-beb7-44d6-a193-5cbcbde32d42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.853 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[7138f92a-e9f9-443c-80aa-19c154659748]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.884 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[3cebb73c-de61-4adb-8703-ac0a7d54f88a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.887 28643 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp23wxy_tt/privsep.sock']
Oct 08 19:06:00 compute-0 nova_compute[117514]: 2025-10-08 19:06:00.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:00 compute-0 nova_compute[117514]: 2025-10-08 19:06:00.509 2 DEBUG nova.compute.manager [req-e62b17cc-b9bf-4c77-8efe-65522aed82ad req-a53561a2-a5ac-4a90-80e4-9837d516abf8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received event network-vif-plugged-82f4743a-dcdc-49f7-be61-94d565e29842 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:06:00 compute-0 nova_compute[117514]: 2025-10-08 19:06:00.510 2 DEBUG oslo_concurrency.lockutils [req-e62b17cc-b9bf-4c77-8efe-65522aed82ad req-a53561a2-a5ac-4a90-80e4-9837d516abf8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:06:00 compute-0 nova_compute[117514]: 2025-10-08 19:06:00.510 2 DEBUG oslo_concurrency.lockutils [req-e62b17cc-b9bf-4c77-8efe-65522aed82ad req-a53561a2-a5ac-4a90-80e4-9837d516abf8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:06:00 compute-0 nova_compute[117514]: 2025-10-08 19:06:00.511 2 DEBUG oslo_concurrency.lockutils [req-e62b17cc-b9bf-4c77-8efe-65522aed82ad req-a53561a2-a5ac-4a90-80e4-9837d516abf8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:06:00 compute-0 nova_compute[117514]: 2025-10-08 19:06:00.511 2 DEBUG nova.compute.manager [req-e62b17cc-b9bf-4c77-8efe-65522aed82ad req-a53561a2-a5ac-4a90-80e4-9837d516abf8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] No waiting events found dispatching network-vif-plugged-82f4743a-dcdc-49f7-be61-94d565e29842 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:06:00 compute-0 nova_compute[117514]: 2025-10-08 19:06:00.511 2 WARNING nova.compute.manager [req-e62b17cc-b9bf-4c77-8efe-65522aed82ad req-a53561a2-a5ac-4a90-80e4-9837d516abf8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received unexpected event network-vif-plugged-82f4743a-dcdc-49f7-be61-94d565e29842 for instance with vm_state active and task_state None.
Oct 08 19:06:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:00.553 28643 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 08 19:06:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:00.555 28643 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp23wxy_tt/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Oct 08 19:06:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:00.437 144740 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 08 19:06:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:00.444 144740 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 08 19:06:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:00.448 144740 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Oct 08 19:06:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:00.448 144740 INFO oslo.privsep.daemon [-] privsep daemon running as pid 144740
Oct 08 19:06:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:00.559 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[9abbe92c-4ba0-452f-81a1-7c732f80de13]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.011 144740 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.011 144740 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.011 144740 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.576 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[82cc5cfe-37d3-4b21-8041-7c6ce15ab75b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:01 compute-0 NetworkManager[1035]: <info>  [1759950361.5904] manager: (tapa913b285-60): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.589 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[faedad42-74a4-4c5a-93a6-6307cdc51eee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.631 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[e4e5d935-148f-4d34-a3cb-779ef932288b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:01 compute-0 systemd-udevd[144750]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.640 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[d229c2a9-e94f-46b6-a0d4-7a3e564d0431]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:01 compute-0 NetworkManager[1035]: <info>  [1759950361.6674] device (tapa913b285-60): carrier: link connected
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.678 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[666618cc-f3fd-4286-b318-0797001b7619]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.706 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[314a3fb3-9ac6-4024-b2e3-7734946d5a73]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa913b285-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:f1:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 103224, 'reachable_time': 19301, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 144769, 'error': None, 'target': 'ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.730 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d49245-da07-447f-b2df-6f83ad79f103]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feff:f109'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 103224, 'tstamp': 103224}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 144770, 'error': None, 'target': 'ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.755 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b2365eeb-9c5e-4946-b634-bf3512b69c67]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa913b285-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:f1:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 103224, 'reachable_time': 19301, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 144771, 'error': None, 'target': 'ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.798 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[409dd171-6139-4fd1-8097-ce59da49e903]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.890 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[4c9eb95a-f153-4ada-b626-a7b245c976ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.893 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa913b285-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.893 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.894 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa913b285-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:06:01 compute-0 nova_compute[117514]: 2025-10-08 19:06:01.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:01 compute-0 kernel: tapa913b285-60: entered promiscuous mode
Oct 08 19:06:01 compute-0 NetworkManager[1035]: <info>  [1759950361.8989] manager: (tapa913b285-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Oct 08 19:06:01 compute-0 nova_compute[117514]: 2025-10-08 19:06:01.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.903 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa913b285-60, col_values=(('external_ids', {'iface-id': 'f9878aab-28ef-456a-a43a-7cacc2381b1f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:06:01 compute-0 nova_compute[117514]: 2025-10-08 19:06:01.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:01 compute-0 ovn_controller[19759]: 2025-10-08T19:06:01Z|00031|binding|INFO|Releasing lport f9878aab-28ef-456a-a43a-7cacc2381b1f from this chassis (sb_readonly=0)
Oct 08 19:06:01 compute-0 nova_compute[117514]: 2025-10-08 19:06:01.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.908 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a913b285-6d0a-478e-aa24-18bb458d8f7a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a913b285-6d0a-478e-aa24-18bb458d8f7a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.910 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[49711289-52ee-443c-befd-8fab8565e175]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.911 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: global
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     log         /dev/log local0 debug
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     log-tag     haproxy-metadata-proxy-a913b285-6d0a-478e-aa24-18bb458d8f7a
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     user        root
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     group       root
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     maxconn     1024
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     pidfile     /var/lib/neutron/external/pids/a913b285-6d0a-478e-aa24-18bb458d8f7a.pid.haproxy
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     daemon
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: defaults
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     log global
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     mode http
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     option httplog
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     option dontlognull
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     option http-server-close
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     option forwardfor
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     retries                 3
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     timeout http-request    30s
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     timeout connect         30s
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     timeout client          32s
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     timeout server          32s
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     timeout http-keep-alive 30s
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: listen listener
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     bind 169.254.169.254:80
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     server metadata /var/lib/neutron/metadata_proxy
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:     http-request add-header X-OVN-Network-ID a913b285-6d0a-478e-aa24-18bb458d8f7a
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 08 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.913 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a', 'env', 'PROCESS_TAG=haproxy-a913b285-6d0a-478e-aa24-18bb458d8f7a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a913b285-6d0a-478e-aa24-18bb458d8f7a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 08 19:06:01 compute-0 nova_compute[117514]: 2025-10-08 19:06:01.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:02 compute-0 podman[144804]: 2025-10-08 19:06:02.359540899 +0000 UTC m=+0.066390548 container create 3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 08 19:06:02 compute-0 nova_compute[117514]: 2025-10-08 19:06:02.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:02 compute-0 systemd[1]: Started libpod-conmon-3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa.scope.
Oct 08 19:06:02 compute-0 podman[144804]: 2025-10-08 19:06:02.32489971 +0000 UTC m=+0.031749399 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 08 19:06:02 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:06:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d18ad23441d9b789fc1d272bc7da5f2175a2c9c00ba472ac96b05443d421d00c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 08 19:06:02 compute-0 podman[144804]: 2025-10-08 19:06:02.475930446 +0000 UTC m=+0.182780125 container init 3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:06:02 compute-0 podman[144804]: 2025-10-08 19:06:02.482671304 +0000 UTC m=+0.189520953 container start 3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 08 19:06:02 compute-0 neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a[144818]: [NOTICE]   (144822) : New worker (144824) forked
Oct 08 19:06:02 compute-0 neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a[144818]: [NOTICE]   (144822) : Loading success.
Oct 08 19:06:02 compute-0 nova_compute[117514]: 2025-10-08 19:06:02.750 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:06:02 compute-0 nova_compute[117514]: 2025-10-08 19:06:02.753 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:06:02 compute-0 nova_compute[117514]: 2025-10-08 19:06:02.753 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:06:02 compute-0 nova_compute[117514]: 2025-10-08 19:06:02.753 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:06:03 compute-0 nova_compute[117514]: 2025-10-08 19:06:03.024 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:06:03 compute-0 nova_compute[117514]: 2025-10-08 19:06:03.027 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquired lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:06:03 compute-0 nova_compute[117514]: 2025-10-08 19:06:03.028 2 DEBUG nova.network.neutron [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 08 19:06:03 compute-0 nova_compute[117514]: 2025-10-08 19:06:03.029 2 DEBUG nova.objects.instance [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 533c431a-8ae8-4310-81dc-29285b78f93c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.076 2 DEBUG nova.network.neutron [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updating instance_info_cache with network_info: [{"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.097 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Releasing lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.098 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.100 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.101 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.101 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.105 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.106 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.106 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.107 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.108 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.132 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.133 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.134 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.135 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.212 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:06:05 compute-0 ovn_controller[19759]: 2025-10-08T19:06:05Z|00032|binding|INFO|Releasing lport f9878aab-28ef-456a-a43a-7cacc2381b1f from this chassis (sb_readonly=0)
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:05 compute-0 NetworkManager[1035]: <info>  [1759950365.2485] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/25)
Oct 08 19:06:05 compute-0 NetworkManager[1035]: <info>  [1759950365.2501] device (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 08 19:06:05 compute-0 NetworkManager[1035]: <info>  [1759950365.2570] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/26)
Oct 08 19:06:05 compute-0 NetworkManager[1035]: <info>  [1759950365.2577] device (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 08 19:06:05 compute-0 NetworkManager[1035]: <info>  [1759950365.2591] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Oct 08 19:06:05 compute-0 NetworkManager[1035]: <info>  [1759950365.2599] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Oct 08 19:06:05 compute-0 NetworkManager[1035]: <info>  [1759950365.2605] device (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 08 19:06:05 compute-0 NetworkManager[1035]: <info>  [1759950365.2609] device (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 08 19:06:05 compute-0 podman[144834]: 2025-10-08 19:06:05.278920074 +0000 UTC m=+0.094583137 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.287 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.288 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:06:05 compute-0 ovn_controller[19759]: 2025-10-08T19:06:05Z|00033|binding|INFO|Releasing lport f9878aab-28ef-456a-a43a-7cacc2381b1f from this chassis (sb_readonly=0)
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.340 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.466 2 DEBUG nova.compute.manager [req-90b99b33-b76a-4c31-82fc-5c145b98c9c6 req-ea303629-57ee-47d1-a093-2509ad8a9aba bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received event network-changed-82f4743a-dcdc-49f7-be61-94d565e29842 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.467 2 DEBUG nova.compute.manager [req-90b99b33-b76a-4c31-82fc-5c145b98c9c6 req-ea303629-57ee-47d1-a093-2509ad8a9aba bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Refreshing instance network info cache due to event network-changed-82f4743a-dcdc-49f7-be61-94d565e29842. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.468 2 DEBUG oslo_concurrency.lockutils [req-90b99b33-b76a-4c31-82fc-5c145b98c9c6 req-ea303629-57ee-47d1-a093-2509ad8a9aba bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.468 2 DEBUG oslo_concurrency.lockutils [req-90b99b33-b76a-4c31-82fc-5c145b98c9c6 req-ea303629-57ee-47d1-a093-2509ad8a9aba bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.469 2 DEBUG nova.network.neutron [req-90b99b33-b76a-4c31-82fc-5c145b98c9c6 req-ea303629-57ee-47d1-a093-2509ad8a9aba bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Refreshing network info cache for port 82f4743a-dcdc-49f7-be61-94d565e29842 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.525 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.526 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5946MB free_disk=73.42291259765625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.527 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.527 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.664 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Instance 533c431a-8ae8-4310-81dc-29285b78f93c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.666 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.666 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.723 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Refreshing inventories for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.769 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updating ProviderTree inventory for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.770 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updating inventory in ProviderTree for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.790 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Refreshing aggregate associations for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.813 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Refreshing trait associations for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AVX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.878 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updating inventory in ProviderTree for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.920 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updated inventory for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.924 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updating resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.924 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updating inventory in ProviderTree for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.956 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.956 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.429s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:06:06 compute-0 nova_compute[117514]: 2025-10-08 19:06:06.515 2 DEBUG nova.network.neutron [req-90b99b33-b76a-4c31-82fc-5c145b98c9c6 req-ea303629-57ee-47d1-a093-2509ad8a9aba bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updated VIF entry in instance network info cache for port 82f4743a-dcdc-49f7-be61-94d565e29842. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:06:06 compute-0 nova_compute[117514]: 2025-10-08 19:06:06.516 2 DEBUG nova.network.neutron [req-90b99b33-b76a-4c31-82fc-5c145b98c9c6 req-ea303629-57ee-47d1-a093-2509ad8a9aba bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updating instance_info_cache with network_info: [{"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:06:06 compute-0 nova_compute[117514]: 2025-10-08 19:06:06.535 2 DEBUG oslo_concurrency.lockutils [req-90b99b33-b76a-4c31-82fc-5c145b98c9c6 req-ea303629-57ee-47d1-a093-2509ad8a9aba bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:06:07 compute-0 nova_compute[117514]: 2025-10-08 19:06:07.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.592 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}151d582867e0a6380c4d0d029ac59b6d50f43a9bf2fdcced1ca2054ddb79aeff" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.662 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Wed, 08 Oct 2025 19:06:08 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-175e4ff3-0db0-45f6-9e8e-4b14d7e7d211 x-openstack-request-id: req-175e4ff3-0db0-45f6-9e8e-4b14d7e7d211 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.662 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "0e642ddb-c06b-4314-8c06-76ae32c14bd7", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/0e642ddb-c06b-4314-8c06-76ae32c14bd7"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/0e642ddb-c06b-4314-8c06-76ae32c14bd7"}]}, {"id": "e8a148fc-4419-4813-98ff-a17e2a95609e", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/e8a148fc-4419-4813-98ff-a17e2a95609e"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/e8a148fc-4419-4813-98ff-a17e2a95609e"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.662 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-175e4ff3-0db0-45f6-9e8e-4b14d7e7d211 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.664 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/e8a148fc-4419-4813-98ff-a17e2a95609e -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}151d582867e0a6380c4d0d029ac59b6d50f43a9bf2fdcced1ca2054ddb79aeff" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.717 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Wed, 08 Oct 2025 19:06:08 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-ab2f69df-ec97-439d-8d8a-9211b5334b9c x-openstack-request-id: req-ab2f69df-ec97-439d-8d8a-9211b5334b9c _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.717 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "e8a148fc-4419-4813-98ff-a17e2a95609e", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/e8a148fc-4419-4813-98ff-a17e2a95609e"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/e8a148fc-4419-4813-98ff-a17e2a95609e"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.717 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/e8a148fc-4419-4813-98ff-a17e2a95609e used request id req-ab2f69df-ec97-439d-8d8a-9211b5334b9c request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.719 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'name': 'tempest-TestNetworkBasicOps-server-447228763', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'hostId': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.719 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.719 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.719 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-447228763>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-447228763>]
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.721 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.721 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.721 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-447228763>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-447228763>]
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.721 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.724 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 533c431a-8ae8-4310-81dc-29285b78f93c / tap82f4743a-dc inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.724 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35907205-2001-4298-b707-b7c15bfc8a4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.722071', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd901c0d6-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': '814dcd68873ab01e1c8e1a3c4e1eb53c88bdd2bfc5c9a292bc08991897fe2428'}]}, 'timestamp': '2025-10-08 19:06:08.725746', '_unique_id': '3df6988ba98d47dd86a91eee91cb2b1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.745 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.756 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.756 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2132c7bf-c3db-4001-874b-20b74769015f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-vda', 'timestamp': '2025-10-08T19:06:08.745917', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9068698-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.375248515, 'message_signature': '485e117a1b94a71ef7761fe483f3f663b00fb582d0601b665bda645a7603bb5a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-sda', 'timestamp': '2025-10-08T19:06:08.745917', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd9069700-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.375248515, 'message_signature': '6ecc4cc1becb660186a0f71bf1d43249562cc9002cd33ef7e5bfce8ffa44d0db'}]}, 'timestamp': '2025-10-08 19:06:08.757271', '_unique_id': '7f4736544fec4b9981d41f908e33d284'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.759 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.780 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.781 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f810ccde-87fc-49f2-9ee2-be573484263e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-vda', 'timestamp': '2025-10-08T19:06:08.759323', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd90a3a40-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': '91541f0b9023891debd66cf2b872c7d16c3f815ad3dccacd0672659312253e8c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-sda', 'timestamp': '2025-10-08T19:06:08.759323', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd90a49c2-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': '5afca42953fd269c595897721dcfb47a5d6907f6a5d5ad022954cbf525a35663'}]}, 'timestamp': '2025-10-08 19:06:08.781507', '_unique_id': '10912d924bbf4eb396e077f15e20a4d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.783 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.783 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.read.latency volume: 515933499 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.783 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.read.latency volume: 2839399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d9c83e1-911f-4754-a85f-50bea97b7f01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 515933499, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-vda', 'timestamp': '2025-10-08T19:06:08.783477', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd90aa322-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': 'f3dbc3eec616903c6e6872994785cab032060c184cd3eb3a9b4b1c65db8bf6df'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2839399, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-sda', 'timestamp': '2025-10-08T19:06:08.783477', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd90aaf8e-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': 'f31722044e2ef5eee98573313da3f03911db9834120b245e00e82a9c70668822'}]}, 'timestamp': '2025-10-08 19:06:08.784104', '_unique_id': '67fdfb1addc3410aa064363c638d7316'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.785 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.785 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79590750-27de-497c-bf98-07408ab21177', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.785776', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd90afeb2-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': 'a05d8b2a1648829827856429737c34c32db6c705d675d9db52170f9b4e11c57a'}]}, 'timestamp': '2025-10-08 19:06:08.786165', '_unique_id': '95af96018ac9476086b208122da5511d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.787 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.787 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.read.bytes volume: 23816192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.788 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17849020-0a83-4f5a-86c9-2960de6e9b80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23816192, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-vda', 'timestamp': '2025-10-08T19:06:08.787810', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd90b4f70-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': 'f7c86bb0cceec6358c9a6bb864979b95246c3d6c38380c9d66071d9532454c3b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-sda', 'timestamp': '2025-10-08T19:06:08.787810', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd90b5cb8-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': '4942e84989de3b61b0c113c3e33b33e7a6068cef98fba1ba5695a9f416511a81'}]}, 'timestamp': '2025-10-08 19:06:08.788542', '_unique_id': '71f33555e38d4a0bb97fe09ac0ffaa06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.790 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.790 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '582234b2-548b-4909-9b17-093335ee7e5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.790401', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd90bb2d0-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': '4711387174c74737c60eeecbc3af46c5c0dfe110f4cdca85da8cc4403f022b8e'}]}, 'timestamp': '2025-10-08 19:06:08.790766', '_unique_id': 'd4b67bd24ebe426bb00a8d22525f6fa3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.792 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.792 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f508fd0-f7c3-439e-95c2-150604a99a95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.792403', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd90c01c2-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': '3b375ef2e2382ed9665fa77c67875eaeadf7e6949d4d910b043e18a9d769ffe5'}]}, 'timestamp': '2025-10-08 19:06:08.792826', '_unique_id': '012fc97dabd14fd199a4f622aff92514'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.794 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.794 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '802f9d92-874a-479d-b896-b9e3177c04d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.794612', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd90c55f0-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': 'd2f46a9f3708a1889b8c680bf39dfa7e70d670e97ee86b1a8ac4d427fe29759f'}]}, 'timestamp': '2025-10-08 19:06:08.794958', '_unique_id': '7f7a852476244d32be286498f63b90a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.796 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.796 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ab75317-c722-4360-a33b-fef661f0a0d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.796468', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd90c9e52-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': 'b7f5a3a199d1274675072338663f2d4c5f6a4d0f4d83aa1c7287a4195ceb5f0b'}]}, 'timestamp': '2025-10-08 19:06:08.796780', '_unique_id': 'aa6b59e9f0eb449cac8fdbdd6bdc2d8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.798 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.798 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.798 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-447228763>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-447228763>]
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.798 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.798 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36864609-ae67-4e37-8cc4-34dc4c01d61c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.798896', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd90cfd48-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': 'ad5eb97330b664aae4012415f1ac95c9d92817386fa3aa9100526665e5091283'}]}, 'timestamp': '2025-10-08 19:06:08.799222', '_unique_id': 'ad00028cc2d9402984125e0ea6ef5c5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.800 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.801 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.801 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-447228763>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-447228763>]
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.801 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.818 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/cpu volume: 9340000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3f4157d-bf78-4ce5-b3fb-c457622aa95f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9340000000, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'timestamp': '2025-10-08T19:06:08.801403', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'd90ff836-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.447065844, 'message_signature': 'a975082fcbfba5115fb76843583f1cc24d35464ebbec455a8c631a00489f694c'}]}, 'timestamp': '2025-10-08 19:06:08.818815', '_unique_id': '938cb870f7ba469b815c1a55aac58051'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.821 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.821 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '802bd66e-c197-451a-8b75-e8836fadce24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.821129', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd9106258-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': '7d4dacd3fb4947b3244105af2d95b57145ff52656f4373da2e5f5f2d8dd2275c'}]}, 'timestamp': '2025-10-08 19:06:08.821511', '_unique_id': '8c247212d1b74453ab468e38ab318b79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.823 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.823 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.823 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84afa966-8d57-4fb6-b0d6-4dcc6b3fbcf7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-vda', 'timestamp': '2025-10-08T19:06:08.823353', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd910b99c-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': 'bb952d22a29a99e42c486397e1a76a0eb2298e0268ac8622c6edaa101f1fb412'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-sda', 'timestamp': '2025-10-08T19:06:08.823353', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd910c4be-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': '61c6664613ca96edf172e080fc6eceaa737d6173da56c3d82c549ce63468b56d'}]}, 'timestamp': '2025-10-08 19:06:08.824008', '_unique_id': '3fffd89452ff4866bbb5ebacd1473578'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.825 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.825 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.826 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48d41bb4-f218-473f-bcaf-5101e7392a41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-vda', 'timestamp': '2025-10-08T19:06:08.825802', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9111b08-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': '622529d7be731e9f047a489dfe2cf06b8c9fb498bf62b0a597ffc4a43e82a8dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-sda', 'timestamp': '2025-10-08T19:06:08.825802', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd911286e-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': 'c9fa407418210e98d3f3402b72d383972a050fb1a64ee09176b2f3655e9db99c'}]}, 'timestamp': '2025-10-08 19:06:08.826519', '_unique_id': '56e481ee21544516bdc6775327afbd0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.828 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.828 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2fa4bb0-9907-4761-b285-709c2e25b9d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.828369', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd9117e86-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': '65b1907fd40c52876db58cd4d145605f304b77a40ef0a42d64831927f6305a9f'}]}, 'timestamp': '2025-10-08 19:06:08.828774', '_unique_id': 'f72eb765c67f4120b11bdf607c123840'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.830 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.830 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9cfe6867-8cf2-4b99-808a-06a2b88b4807', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.830536', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd911d156-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': 'c99aaf0f364bf07a400672550f31d61a9d8970469b104864c2c07699681a8149'}]}, 'timestamp': '2025-10-08 19:06:08.830882', '_unique_id': '784d032244604e1b8b17de2cf3dc3339'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.832 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.832 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.832 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4b3963e-9bba-41a2-9917-e1d1c6ecf47a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-vda', 'timestamp': '2025-10-08T19:06:08.832429', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9121b34-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.375248515, 'message_signature': 'f05cd9e5a3c9df1b818769037658a10f3e3c7cd2f07b9983f3a362c2d1eec8aa'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-sda', 'timestamp': '2025-10-08T19:06:08.832429', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd91227fa-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.375248515, 'message_signature': '675904e550859f9139f8817b21f244c47655534340fd0408280571c30d2db36b'}]}, 'timestamp': '2025-10-08 19:06:08.833056', '_unique_id': '732ee80b83fc4baca367eb12ee86e1b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.834 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.834 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/memory.usage volume: 40.3671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77a9ef39-cc60-4708-a906-b38c87ef002e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.3671875, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'timestamp': '2025-10-08T19:06:08.834662', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'd9127520-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.447065844, 'message_signature': '886d17b0d1d51e2945633ada53176f67cd54da51f803c1eb9b0c828044b67747'}]}, 'timestamp': '2025-10-08 19:06:08.835057', '_unique_id': 'd4a942374c93446094d6e6046b5c8f35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.836 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.837 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.837 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7686e3b6-0e94-4eb1-804d-478f989c60b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-vda', 'timestamp': '2025-10-08T19:06:08.837057', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd912d196-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.375248515, 'message_signature': '7189b095546912ec9fd682df85ec703f69f371680def042d7cdb768cc8a4bcfc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-sda', 'timestamp': '2025-10-08T19:06:08.837057', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd912e0f0-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.375248515, 'message_signature': '748cca2cc3b444f4437be6a8baa8c9a84939772b7adc4f536820bf1bac1107c8'}]}, 'timestamp': '2025-10-08 19:06:08.837834', '_unique_id': 'a84a206ebacf421a9565e4c718cb7824'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.839 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.839 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.read.requests volume: 770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.839 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f6a2dfd-8fa9-4050-ba35-1f77367cf81c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 770, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-vda', 'timestamp': '2025-10-08T19:06:08.839522', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd913305a-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': '04c5cfb6da0cac461baaf555918673a8740f269195ae4298a2b0def01cbb7c36'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-sda', 'timestamp': '2025-10-08T19:06:08.839522', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd9133f14-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': 'a408eee8ea995999312433ccfa8bc84cba23b2e280c561ded213ac73830129a6'}]}, 'timestamp': '2025-10-08 19:06:08.840252', '_unique_id': 'b79882dccee5413c99f63ee4bbf57122'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:06:10 compute-0 nova_compute[117514]: 2025-10-08 19:06:10.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:11 compute-0 ovn_controller[19759]: 2025-10-08T19:06:11Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2e:6b:6c 10.100.0.3
Oct 08 19:06:11 compute-0 ovn_controller[19759]: 2025-10-08T19:06:11Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2e:6b:6c 10.100.0.3
Oct 08 19:06:12 compute-0 nova_compute[117514]: 2025-10-08 19:06:12.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:13 compute-0 podman[144886]: 2025-10-08 19:06:13.656084282 +0000 UTC m=+0.078392694 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 08 19:06:15 compute-0 nova_compute[117514]: 2025-10-08 19:06:15.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:17 compute-0 nova_compute[117514]: 2025-10-08 19:06:17.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:18 compute-0 nova_compute[117514]: 2025-10-08 19:06:18.034 2 INFO nova.compute.manager [None req-ba8ce6c8-3274-4373-9943-af7f439b2e3b efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Get console output
Oct 08 19:06:18 compute-0 nova_compute[117514]: 2025-10-08 19:06:18.161 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct 08 19:06:19 compute-0 podman[144907]: 2025-10-08 19:06:19.674103045 +0000 UTC m=+0.073486097 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:06:19 compute-0 podman[144906]: 2025-10-08 19:06:19.706752818 +0000 UTC m=+0.115147002 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 08 19:06:20 compute-0 nova_compute[117514]: 2025-10-08 19:06:20.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:22 compute-0 nova_compute[117514]: 2025-10-08 19:06:22.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:23 compute-0 podman[144945]: 2025-10-08 19:06:23.66632635 +0000 UTC m=+0.081844779 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:06:25 compute-0 nova_compute[117514]: 2025-10-08 19:06:25.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:25 compute-0 podman[144969]: 2025-10-08 19:06:25.6337282 +0000 UTC m=+0.052210339 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:06:25 compute-0 podman[144970]: 2025-10-08 19:06:25.664769491 +0000 UTC m=+0.066769167 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:06:27 compute-0 nova_compute[117514]: 2025-10-08 19:06:27.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:27 compute-0 podman[145008]: 2025-10-08 19:06:27.683151723 +0000 UTC m=+0.110399189 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.394 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.395 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.412 2 DEBUG nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.496 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.497 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.506 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.506 2 INFO nova.compute.claims [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Claim successful on node compute-0.ctlplane.example.com
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.625 2 DEBUG nova.compute.provider_tree [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.650 2 DEBUG nova.scheduler.client.report [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.687 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.688 2 DEBUG nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.738 2 DEBUG nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.739 2 DEBUG nova.network.neutron [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.761 2 INFO nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.778 2 DEBUG nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.876 2 DEBUG nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.878 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.879 2 INFO nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Creating image(s)
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.880 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.880 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.881 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.904 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.995 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.996 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.997 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.012 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.035 2 DEBUG nova.policy [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.081 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.082 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.127 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.131 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.132 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.216 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.218 2 DEBUG nova.virt.disk.api [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.219 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.311 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.313 2 DEBUG nova.virt.disk.api [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.313 2 DEBUG nova.objects.instance [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 432f298f-78dd-4e9e-9ee4-279c2bc544c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.330 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.330 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Ensure instance console log exists: /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.331 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.332 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.333 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.607 2 DEBUG nova.network.neutron [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Successfully created port: 41ab28a1-9254-46a6-97fc-2220fe30eccd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 08 19:06:31 compute-0 nova_compute[117514]: 2025-10-08 19:06:31.555 2 DEBUG nova.network.neutron [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Successfully updated port: 41ab28a1-9254-46a6-97fc-2220fe30eccd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 08 19:06:31 compute-0 nova_compute[117514]: 2025-10-08 19:06:31.573 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-432f298f-78dd-4e9e-9ee4-279c2bc544c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:06:31 compute-0 nova_compute[117514]: 2025-10-08 19:06:31.573 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-432f298f-78dd-4e9e-9ee4-279c2bc544c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:06:31 compute-0 nova_compute[117514]: 2025-10-08 19:06:31.574 2 DEBUG nova.network.neutron [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 08 19:06:31 compute-0 nova_compute[117514]: 2025-10-08 19:06:31.635 2 DEBUG nova.compute.manager [req-cb7578e8-60fb-4c89-b616-ba692ab08795 req-da88d31b-49e2-4ee5-b7f5-8e9a8d101196 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Received event network-changed-41ab28a1-9254-46a6-97fc-2220fe30eccd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:06:31 compute-0 nova_compute[117514]: 2025-10-08 19:06:31.635 2 DEBUG nova.compute.manager [req-cb7578e8-60fb-4c89-b616-ba692ab08795 req-da88d31b-49e2-4ee5-b7f5-8e9a8d101196 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Refreshing instance network info cache due to event network-changed-41ab28a1-9254-46a6-97fc-2220fe30eccd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:06:31 compute-0 nova_compute[117514]: 2025-10-08 19:06:31.636 2 DEBUG oslo_concurrency.lockutils [req-cb7578e8-60fb-4c89-b616-ba692ab08795 req-da88d31b-49e2-4ee5-b7f5-8e9a8d101196 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-432f298f-78dd-4e9e-9ee4-279c2bc544c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:06:31 compute-0 nova_compute[117514]: 2025-10-08 19:06:31.696 2 DEBUG nova.network.neutron [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 08 19:06:32 compute-0 nova_compute[117514]: 2025-10-08 19:06:32.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.043 2 DEBUG nova.network.neutron [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Updating instance_info_cache with network_info: [{"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.061 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-432f298f-78dd-4e9e-9ee4-279c2bc544c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.062 2 DEBUG nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Instance network_info: |[{"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.062 2 DEBUG oslo_concurrency.lockutils [req-cb7578e8-60fb-4c89-b616-ba692ab08795 req-da88d31b-49e2-4ee5-b7f5-8e9a8d101196 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-432f298f-78dd-4e9e-9ee4-279c2bc544c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.063 2 DEBUG nova.network.neutron [req-cb7578e8-60fb-4c89-b616-ba692ab08795 req-da88d31b-49e2-4ee5-b7f5-8e9a8d101196 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Refreshing network info cache for port 41ab28a1-9254-46a6-97fc-2220fe30eccd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.066 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Start _get_guest_xml network_info=[{"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.072 2 WARNING nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.079 2 DEBUG nova.virt.libvirt.host [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.080 2 DEBUG nova.virt.libvirt.host [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.083 2 DEBUG nova.virt.libvirt.host [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.084 2 DEBUG nova.virt.libvirt.host [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.085 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.085 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.085 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.086 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.086 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.086 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.086 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.087 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.087 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.087 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.088 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.088 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.092 2 DEBUG nova.virt.libvirt.vif [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:06:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-389250187',display_name='tempest-TestNetworkBasicOps-server-389250187',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-389250187',id=2,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAad1Pe21vvEy0SgHMu1VF6n4SO9ujKCPB0SgACJ8nvOuhW/VjCSPOSETWk3+gFjb/KHaSwvZLGtfcSFz4SkdC0dg68nGstzwyBghc627R2c2cxKu7YHJFmDoK+RJ/yIQg==',key_name='tempest-TestNetworkBasicOps-1028595893',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-e7tbigki',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:06:29Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=432f298f-78dd-4e9e-9ee4-279c2bc544c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.092 2 DEBUG nova.network.os_vif_util [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.093 2 DEBUG nova.network.os_vif_util [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ab:8d,bridge_name='br-int',has_traffic_filtering=True,id=41ab28a1-9254-46a6-97fc-2220fe30eccd,network=Network(3f19211d-1888-42c2-a8ff-1de7bc4f9219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41ab28a1-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.094 2 DEBUG nova.objects.instance [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 432f298f-78dd-4e9e-9ee4-279c2bc544c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.107 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] End _get_guest_xml xml=<domain type="kvm">
Oct 08 19:06:35 compute-0 nova_compute[117514]:   <uuid>432f298f-78dd-4e9e-9ee4-279c2bc544c1</uuid>
Oct 08 19:06:35 compute-0 nova_compute[117514]:   <name>instance-00000002</name>
Oct 08 19:06:35 compute-0 nova_compute[117514]:   <memory>131072</memory>
Oct 08 19:06:35 compute-0 nova_compute[117514]:   <vcpu>1</vcpu>
Oct 08 19:06:35 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <nova:name>tempest-TestNetworkBasicOps-server-389250187</nova:name>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <nova:creationTime>2025-10-08 19:06:35</nova:creationTime>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <nova:flavor name="m1.nano">
Oct 08 19:06:35 compute-0 nova_compute[117514]:         <nova:memory>128</nova:memory>
Oct 08 19:06:35 compute-0 nova_compute[117514]:         <nova:disk>1</nova:disk>
Oct 08 19:06:35 compute-0 nova_compute[117514]:         <nova:swap>0</nova:swap>
Oct 08 19:06:35 compute-0 nova_compute[117514]:         <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:06:35 compute-0 nova_compute[117514]:         <nova:vcpus>1</nova:vcpus>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       </nova:flavor>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <nova:owner>
Oct 08 19:06:35 compute-0 nova_compute[117514]:         <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:06:35 compute-0 nova_compute[117514]:         <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       </nova:owner>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <nova:ports>
Oct 08 19:06:35 compute-0 nova_compute[117514]:         <nova:port uuid="41ab28a1-9254-46a6-97fc-2220fe30eccd">
Oct 08 19:06:35 compute-0 nova_compute[117514]:           <nova:ip type="fixed" address="10.100.0.25" ipVersion="4"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:         </nova:port>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       </nova:ports>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     </nova:instance>
Oct 08 19:06:35 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:06:35 compute-0 nova_compute[117514]:   <sysinfo type="smbios">
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <system>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <entry name="manufacturer">RDO</entry>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <entry name="product">OpenStack Compute</entry>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <entry name="serial">432f298f-78dd-4e9e-9ee4-279c2bc544c1</entry>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <entry name="uuid">432f298f-78dd-4e9e-9ee4-279c2bc544c1</entry>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <entry name="family">Virtual Machine</entry>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     </system>
Oct 08 19:06:35 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:06:35 compute-0 nova_compute[117514]:   <os>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <boot dev="hd"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <smbios mode="sysinfo"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:   </os>
Oct 08 19:06:35 compute-0 nova_compute[117514]:   <features>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <vmcoreinfo/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:   </features>
Oct 08 19:06:35 compute-0 nova_compute[117514]:   <clock offset="utc">
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <timer name="pit" tickpolicy="delay"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <timer name="hpet" present="no"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:06:35 compute-0 nova_compute[117514]:   <cpu mode="host-model" match="exact">
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <topology sockets="1" cores="1" threads="1"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:06:35 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <disk type="file" device="disk">
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <target dev="vda" bus="virtio"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <disk type="file" device="cdrom">
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <driver name="qemu" type="raw" cache="none"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk.config"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <target dev="sda" bus="sata"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <interface type="ethernet">
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <mac address="fa:16:3e:a4:ab:8d"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <mtu size="1442"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <target dev="tap41ab28a1-92"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <serial type="pty">
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <log file="/var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/console.log" append="off"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <video>
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     </video>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <input type="tablet" bus="usb"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <rng model="virtio">
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <backend model="random">/dev/urandom</backend>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <controller type="usb" index="0"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     <memballoon model="virtio">
Oct 08 19:06:35 compute-0 nova_compute[117514]:       <stats period="10"/>
Oct 08 19:06:35 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:06:35 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:06:35 compute-0 nova_compute[117514]: </domain>
Oct 08 19:06:35 compute-0 nova_compute[117514]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.109 2 DEBUG nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Preparing to wait for external event network-vif-plugged-41ab28a1-9254-46a6-97fc-2220fe30eccd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.109 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.110 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.110 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.112 2 DEBUG nova.virt.libvirt.vif [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:06:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-389250187',display_name='tempest-TestNetworkBasicOps-server-389250187',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-389250187',id=2,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAad1Pe21vvEy0SgHMu1VF6n4SO9ujKCPB0SgACJ8nvOuhW/VjCSPOSETWk3+gFjb/KHaSwvZLGtfcSFz4SkdC0dg68nGstzwyBghc627R2c2cxKu7YHJFmDoK+RJ/yIQg==',key_name='tempest-TestNetworkBasicOps-1028595893',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-e7tbigki',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:06:29Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=432f298f-78dd-4e9e-9ee4-279c2bc544c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.112 2 DEBUG nova.network.os_vif_util [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.113 2 DEBUG nova.network.os_vif_util [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ab:8d,bridge_name='br-int',has_traffic_filtering=True,id=41ab28a1-9254-46a6-97fc-2220fe30eccd,network=Network(3f19211d-1888-42c2-a8ff-1de7bc4f9219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41ab28a1-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.114 2 DEBUG os_vif [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ab:8d,bridge_name='br-int',has_traffic_filtering=True,id=41ab28a1-9254-46a6-97fc-2220fe30eccd,network=Network(3f19211d-1888-42c2-a8ff-1de7bc4f9219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41ab28a1-92') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.116 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.116 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.121 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41ab28a1-92, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.122 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap41ab28a1-92, col_values=(('external_ids', {'iface-id': '41ab28a1-9254-46a6-97fc-2220fe30eccd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a4:ab:8d', 'vm-uuid': '432f298f-78dd-4e9e-9ee4-279c2bc544c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:35 compute-0 NetworkManager[1035]: <info>  [1759950395.1258] manager: (tap41ab28a1-92): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.135 2 INFO os_vif [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ab:8d,bridge_name='br-int',has_traffic_filtering=True,id=41ab28a1-9254-46a6-97fc-2220fe30eccd,network=Network(3f19211d-1888-42c2-a8ff-1de7bc4f9219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41ab28a1-92')
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.199 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.200 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.200 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:a4:ab:8d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.201 2 INFO nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Using config drive
Oct 08 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:35 compute-0 podman[145051]: 2025-10-08 19:06:35.673824007 +0000 UTC m=+0.085424003 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.182 2 INFO nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Creating config drive at /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk.config
Oct 08 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.186 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_7kqzzqc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.323 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_7kqzzqc" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:06:36 compute-0 NetworkManager[1035]: <info>  [1759950396.4017] manager: (tap41ab28a1-92): new Tun device (/org/freedesktop/NetworkManager/Devices/30)
Oct 08 19:06:36 compute-0 kernel: tap41ab28a1-92: entered promiscuous mode
Oct 08 19:06:36 compute-0 ovn_controller[19759]: 2025-10-08T19:06:36Z|00034|binding|INFO|Claiming lport 41ab28a1-9254-46a6-97fc-2220fe30eccd for this chassis.
Oct 08 19:06:36 compute-0 ovn_controller[19759]: 2025-10-08T19:06:36Z|00035|binding|INFO|41ab28a1-9254-46a6-97fc-2220fe30eccd: Claiming fa:16:3e:a4:ab:8d 10.100.0.25
Oct 08 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.421 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:ab:8d 10.100.0.25'], port_security=['fa:16:3e:a4:ab:8d 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '432f298f-78dd-4e9e-9ee4-279c2bc544c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f19211d-1888-42c2-a8ff-1de7bc4f9219', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b1b038d5-57f5-4b2c-9de0-90d7e6862c10', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80124168-1b37-4a7c-9765-130e2be44549, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=41ab28a1-9254-46a6-97fc-2220fe30eccd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.423 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 41ab28a1-9254-46a6-97fc-2220fe30eccd in datapath 3f19211d-1888-42c2-a8ff-1de7bc4f9219 bound to our chassis
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.424 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3f19211d-1888-42c2-a8ff-1de7bc4f9219
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.440 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[88396f44-3558-4413-a7a6-76b6a51ca2bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.441 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3f19211d-11 in ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.443 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3f19211d-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.443 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b8c28486-dbc7-4676-bbc5-5bd53497e6ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.444 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[2c6c4160-8025-4612-885d-276c70eeaf17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:36 compute-0 systemd-machined[77568]: New machine qemu-2-instance-00000002.
Oct 08 19:06:36 compute-0 systemd-udevd[145095]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:06:36 compute-0 NetworkManager[1035]: <info>  [1759950396.4672] device (tap41ab28a1-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 08 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:36 compute-0 NetworkManager[1035]: <info>  [1759950396.4704] device (tap41ab28a1-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.470 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[edd2e2c5-1859-40ff-9d86-e580d560130a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:36 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Oct 08 19:06:36 compute-0 ovn_controller[19759]: 2025-10-08T19:06:36Z|00036|binding|INFO|Setting lport 41ab28a1-9254-46a6-97fc-2220fe30eccd ovn-installed in OVS
Oct 08 19:06:36 compute-0 ovn_controller[19759]: 2025-10-08T19:06:36Z|00037|binding|INFO|Setting lport 41ab28a1-9254-46a6-97fc-2220fe30eccd up in Southbound
Oct 08 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.505 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[3f28e896-618a-4c56-81c3-2a4a161f7aa8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.538 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[7872bd47-4479-48f0-a849-9135d4168694]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.543 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[c807b19e-02c3-4857-b873-b74ec07c649c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:36 compute-0 NetworkManager[1035]: <info>  [1759950396.5454] manager: (tap3f19211d-10): new Veth device (/org/freedesktop/NetworkManager/Devices/31)
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.577 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[3c9c0af3-a790-4327-b448-b188d34b4443]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.580 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[a64e6b3d-d67d-4147-ab2b-3afb4bace699]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:36 compute-0 NetworkManager[1035]: <info>  [1759950396.6073] device (tap3f19211d-10): carrier: link connected
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.613 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[bbb40408-b3c3-4f29-89a1-7f4eb0ec7f6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.631 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[d38d5944-fe21-4790-a3f3-7d6cc27ed853]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f19211d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5a:4d:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 106718, 'reachable_time': 16095, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 145127, 'error': None, 'target': 'ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.648 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[bda7f2f4-ef6f-412c-9407-1bb43816aa0f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5a:4d14'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 106718, 'tstamp': 106718}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 145128, 'error': None, 'target': 'ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.665 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[10cf1b51-d01c-4d1f-9ec4-71df8b6abb9c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f19211d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5a:4d:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 106718, 'reachable_time': 16095, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 145129, 'error': None, 'target': 'ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.698 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f3cbc4e6-2828-4f3d-a530-5768b3e8310c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.789 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1840d3ed-30c1-40cd-bd4c-0101c1148d8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.792 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f19211d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.792 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.793 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f19211d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:36 compute-0 kernel: tap3f19211d-10: entered promiscuous mode
Oct 08 19:06:36 compute-0 NetworkManager[1035]: <info>  [1759950396.7970] manager: (tap3f19211d-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.800 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3f19211d-10, col_values=(('external_ids', {'iface-id': '794073fc-ca71-4a94-857f-c3e735aa1420'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:06:36 compute-0 ovn_controller[19759]: 2025-10-08T19:06:36Z|00038|binding|INFO|Releasing lport 794073fc-ca71-4a94-857f-c3e735aa1420 from this chassis (sb_readonly=0)
Oct 08 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.828 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3f19211d-1888-42c2-a8ff-1de7bc4f9219.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3f19211d-1888-42c2-a8ff-1de7bc4f9219.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.829 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[71f8d7f2-a57d-4c00-9239-57cc019cb22a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.830 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: global
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     log         /dev/log local0 debug
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     log-tag     haproxy-metadata-proxy-3f19211d-1888-42c2-a8ff-1de7bc4f9219
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     user        root
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     group       root
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     maxconn     1024
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     pidfile     /var/lib/neutron/external/pids/3f19211d-1888-42c2-a8ff-1de7bc4f9219.pid.haproxy
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     daemon
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: defaults
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     log global
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     mode http
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     option httplog
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     option dontlognull
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     option http-server-close
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     option forwardfor
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     retries                 3
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     timeout http-request    30s
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     timeout connect         30s
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     timeout client          32s
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     timeout server          32s
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     timeout http-keep-alive 30s
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: listen listener
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     bind 169.254.169.254:80
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     server metadata /var/lib/neutron/metadata_proxy
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:     http-request add-header X-OVN-Network-ID 3f19211d-1888-42c2-a8ff-1de7bc4f9219
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 08 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.831 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219', 'env', 'PROCESS_TAG=haproxy-3f19211d-1888-42c2-a8ff-1de7bc4f9219', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3f19211d-1888-42c2-a8ff-1de7bc4f9219.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.260 2 DEBUG nova.compute.manager [req-ab570acd-d15f-4173-b8af-e954624565c4 req-9007d8fb-d184-4bd8-93eb-3dd9cf487754 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Received event network-vif-plugged-41ab28a1-9254-46a6-97fc-2220fe30eccd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.262 2 DEBUG oslo_concurrency.lockutils [req-ab570acd-d15f-4173-b8af-e954624565c4 req-9007d8fb-d184-4bd8-93eb-3dd9cf487754 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.263 2 DEBUG oslo_concurrency.lockutils [req-ab570acd-d15f-4173-b8af-e954624565c4 req-9007d8fb-d184-4bd8-93eb-3dd9cf487754 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.264 2 DEBUG oslo_concurrency.lockutils [req-ab570acd-d15f-4173-b8af-e954624565c4 req-9007d8fb-d184-4bd8-93eb-3dd9cf487754 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.264 2 DEBUG nova.compute.manager [req-ab570acd-d15f-4173-b8af-e954624565c4 req-9007d8fb-d184-4bd8-93eb-3dd9cf487754 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Processing event network-vif-plugged-41ab28a1-9254-46a6-97fc-2220fe30eccd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 08 19:06:37 compute-0 podman[145169]: 2025-10-08 19:06:37.279423324 +0000 UTC m=+0.087014778 container create 2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct 08 19:06:37 compute-0 podman[145169]: 2025-10-08 19:06:37.222967024 +0000 UTC m=+0.030558528 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 08 19:06:37 compute-0 systemd[1]: Started libpod-conmon-2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9.scope.
Oct 08 19:06:37 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:06:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f952153b9c075c928f3a4048d93d2e597dc86bce3878df25663d6f266130ce7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.377 2 DEBUG nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.379 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950397.3764095, 432f298f-78dd-4e9e-9ee4-279c2bc544c1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.379 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] VM Started (Lifecycle Event)
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.382 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.386 2 INFO nova.virt.libvirt.driver [-] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Instance spawned successfully.
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.386 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.396 2 DEBUG nova.network.neutron [req-cb7578e8-60fb-4c89-b616-ba692ab08795 req-da88d31b-49e2-4ee5-b7f5-8e9a8d101196 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Updated VIF entry in instance network info cache for port 41ab28a1-9254-46a6-97fc-2220fe30eccd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.397 2 DEBUG nova.network.neutron [req-cb7578e8-60fb-4c89-b616-ba692ab08795 req-da88d31b-49e2-4ee5-b7f5-8e9a8d101196 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Updating instance_info_cache with network_info: [{"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:06:37 compute-0 podman[145169]: 2025-10-08 19:06:37.397599435 +0000 UTC m=+0.205190959 container init 2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct 08 19:06:37 compute-0 podman[145169]: 2025-10-08 19:06:37.404151893 +0000 UTC m=+0.211743347 container start 2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2)
Oct 08 19:06:37 compute-0 neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219[145184]: [NOTICE]   (145188) : New worker (145190) forked
Oct 08 19:06:37 compute-0 neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219[145184]: [NOTICE]   (145188) : Loading success.
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.432 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.436 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.481 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.482 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950397.3766084, 432f298f-78dd-4e9e-9ee4-279c2bc544c1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.482 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] VM Paused (Lifecycle Event)
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.485 2 DEBUG oslo_concurrency.lockutils [req-cb7578e8-60fb-4c89-b616-ba692ab08795 req-da88d31b-49e2-4ee5-b7f5-8e9a8d101196 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-432f298f-78dd-4e9e-9ee4-279c2bc544c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.491 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.492 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.492 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.493 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.493 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.493 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.498 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.501 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950397.3810613, 432f298f-78dd-4e9e-9ee4-279c2bc544c1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.501 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] VM Resumed (Lifecycle Event)
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.524 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.528 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.548 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.560 2 INFO nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Took 7.68 seconds to spawn the instance on the hypervisor.
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.561 2 DEBUG nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.645 2 INFO nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Took 8.18 seconds to build instance.
Oct 08 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.664 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.269s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:06:39 compute-0 nova_compute[117514]: 2025-10-08 19:06:39.347 2 DEBUG nova.compute.manager [req-6cfb2163-bb22-46bd-b4b8-df981f56858f req-b59014ed-836e-4321-8a59-e516b4ef4969 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Received event network-vif-plugged-41ab28a1-9254-46a6-97fc-2220fe30eccd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:06:39 compute-0 nova_compute[117514]: 2025-10-08 19:06:39.348 2 DEBUG oslo_concurrency.lockutils [req-6cfb2163-bb22-46bd-b4b8-df981f56858f req-b59014ed-836e-4321-8a59-e516b4ef4969 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:06:39 compute-0 nova_compute[117514]: 2025-10-08 19:06:39.348 2 DEBUG oslo_concurrency.lockutils [req-6cfb2163-bb22-46bd-b4b8-df981f56858f req-b59014ed-836e-4321-8a59-e516b4ef4969 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:06:39 compute-0 nova_compute[117514]: 2025-10-08 19:06:39.349 2 DEBUG oslo_concurrency.lockutils [req-6cfb2163-bb22-46bd-b4b8-df981f56858f req-b59014ed-836e-4321-8a59-e516b4ef4969 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:06:39 compute-0 nova_compute[117514]: 2025-10-08 19:06:39.349 2 DEBUG nova.compute.manager [req-6cfb2163-bb22-46bd-b4b8-df981f56858f req-b59014ed-836e-4321-8a59-e516b4ef4969 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] No waiting events found dispatching network-vif-plugged-41ab28a1-9254-46a6-97fc-2220fe30eccd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:06:39 compute-0 nova_compute[117514]: 2025-10-08 19:06:39.350 2 WARNING nova.compute.manager [req-6cfb2163-bb22-46bd-b4b8-df981f56858f req-b59014ed-836e-4321-8a59-e516b4ef4969 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Received unexpected event network-vif-plugged-41ab28a1-9254-46a6-97fc-2220fe30eccd for instance with vm_state active and task_state None.
Oct 08 19:06:40 compute-0 nova_compute[117514]: 2025-10-08 19:06:40.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:40 compute-0 nova_compute[117514]: 2025-10-08 19:06:40.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:42 compute-0 nova_compute[117514]: 2025-10-08 19:06:42.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:42 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:42.444 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:06:42 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:42.447 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 08 19:06:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:44.226 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:06:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:44.227 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:06:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:44.228 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:06:44 compute-0 podman[145199]: 2025-10-08 19:06:44.704633089 +0000 UTC m=+0.114126976 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 08 19:06:45 compute-0 nova_compute[117514]: 2025-10-08 19:06:45.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:45 compute-0 nova_compute[117514]: 2025-10-08 19:06:45.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:49 compute-0 ovn_controller[19759]: 2025-10-08T19:06:49Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a4:ab:8d 10.100.0.25
Oct 08 19:06:49 compute-0 ovn_controller[19759]: 2025-10-08T19:06:49Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a4:ab:8d 10.100.0.25
Oct 08 19:06:49 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:49.450 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:06:50 compute-0 nova_compute[117514]: 2025-10-08 19:06:50.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:50 compute-0 nova_compute[117514]: 2025-10-08 19:06:50.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:50 compute-0 podman[145228]: 2025-10-08 19:06:50.697978572 +0000 UTC m=+0.110521463 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, org.label-schema.schema-version=1.0)
Oct 08 19:06:50 compute-0 podman[145227]: 2025-10-08 19:06:50.698045984 +0000 UTC m=+0.116467774 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Oct 08 19:06:54 compute-0 podman[145265]: 2025-10-08 19:06:54.67490662 +0000 UTC m=+0.086161813 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 08 19:06:55 compute-0 nova_compute[117514]: 2025-10-08 19:06:55.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:55 compute-0 nova_compute[117514]: 2025-10-08 19:06:55.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:56 compute-0 podman[145290]: 2025-10-08 19:06:56.671512838 +0000 UTC m=+0.084151756 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid)
Oct 08 19:06:56 compute-0 podman[145291]: 2025-10-08 19:06:56.6802778 +0000 UTC m=+0.080792720 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.346 2 DEBUG oslo_concurrency.lockutils [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.347 2 DEBUG oslo_concurrency.lockutils [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.348 2 DEBUG oslo_concurrency.lockutils [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.348 2 DEBUG oslo_concurrency.lockutils [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.349 2 DEBUG oslo_concurrency.lockutils [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.351 2 INFO nova.compute.manager [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Terminating instance
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.353 2 DEBUG nova.compute.manager [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 08 19:06:58 compute-0 kernel: tap41ab28a1-92 (unregistering): left promiscuous mode
Oct 08 19:06:58 compute-0 NetworkManager[1035]: <info>  [1759950418.3824] device (tap41ab28a1-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 19:06:58 compute-0 ovn_controller[19759]: 2025-10-08T19:06:58Z|00039|binding|INFO|Releasing lport 41ab28a1-9254-46a6-97fc-2220fe30eccd from this chassis (sb_readonly=0)
Oct 08 19:06:58 compute-0 ovn_controller[19759]: 2025-10-08T19:06:58Z|00040|binding|INFO|Setting lport 41ab28a1-9254-46a6-97fc-2220fe30eccd down in Southbound
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:58 compute-0 ovn_controller[19759]: 2025-10-08T19:06:58Z|00041|binding|INFO|Removing iface tap41ab28a1-92 ovn-installed in OVS
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.455 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:ab:8d 10.100.0.25'], port_security=['fa:16:3e:a4:ab:8d 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '432f298f-78dd-4e9e-9ee4-279c2bc544c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f19211d-1888-42c2-a8ff-1de7bc4f9219', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b1b038d5-57f5-4b2c-9de0-90d7e6862c10', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80124168-1b37-4a7c-9765-130e2be44549, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=41ab28a1-9254-46a6-97fc-2220fe30eccd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.457 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 41ab28a1-9254-46a6-97fc-2220fe30eccd in datapath 3f19211d-1888-42c2-a8ff-1de7bc4f9219 unbound from our chassis
Oct 08 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.458 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3f19211d-1888-42c2-a8ff-1de7bc4f9219, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 08 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.459 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[39b73246-bd46-4862-b6a7-201443236285]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.459 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219 namespace which is not needed anymore
Oct 08 19:06:58 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Oct 08 19:06:58 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 12.536s CPU time.
Oct 08 19:06:58 compute-0 systemd-machined[77568]: Machine qemu-2-instance-00000002 terminated.
Oct 08 19:06:58 compute-0 podman[145328]: 2025-10-08 19:06:58.595295305 +0000 UTC m=+0.154964368 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.628 2 INFO nova.virt.libvirt.driver [-] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Instance destroyed successfully.
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.629 2 DEBUG nova.objects.instance [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 432f298f-78dd-4e9e-9ee4-279c2bc544c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.648 2 DEBUG nova.virt.libvirt.vif [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:06:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-389250187',display_name='tempest-TestNetworkBasicOps-server-389250187',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-389250187',id=2,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAad1Pe21vvEy0SgHMu1VF6n4SO9ujKCPB0SgACJ8nvOuhW/VjCSPOSETWk3+gFjb/KHaSwvZLGtfcSFz4SkdC0dg68nGstzwyBghc627R2c2cxKu7YHJFmDoK+RJ/yIQg==',key_name='tempest-TestNetworkBasicOps-1028595893',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:06:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-e7tbigki',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:06:37Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=432f298f-78dd-4e9e-9ee4-279c2bc544c1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.649 2 DEBUG nova.network.os_vif_util [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.649 2 DEBUG nova.network.os_vif_util [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ab:8d,bridge_name='br-int',has_traffic_filtering=True,id=41ab28a1-9254-46a6-97fc-2220fe30eccd,network=Network(3f19211d-1888-42c2-a8ff-1de7bc4f9219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41ab28a1-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.650 2 DEBUG os_vif [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ab:8d,bridge_name='br-int',has_traffic_filtering=True,id=41ab28a1-9254-46a6-97fc-2220fe30eccd,network=Network(3f19211d-1888-42c2-a8ff-1de7bc4f9219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41ab28a1-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.653 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41ab28a1-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:58 compute-0 neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219[145184]: [NOTICE]   (145188) : haproxy version is 2.8.14-c23fe91
Oct 08 19:06:58 compute-0 neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219[145184]: [NOTICE]   (145188) : path to executable is /usr/sbin/haproxy
Oct 08 19:06:58 compute-0 neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219[145184]: [WARNING]  (145188) : Exiting Master process...
Oct 08 19:06:58 compute-0 neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219[145184]: [WARNING]  (145188) : Exiting Master process...
Oct 08 19:06:58 compute-0 neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219[145184]: [ALERT]    (145188) : Current worker (145190) exited with code 143 (Terminated)
Oct 08 19:06:58 compute-0 neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219[145184]: [WARNING]  (145188) : All workers exited. Exiting... (0)
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.661 2 INFO os_vif [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ab:8d,bridge_name='br-int',has_traffic_filtering=True,id=41ab28a1-9254-46a6-97fc-2220fe30eccd,network=Network(3f19211d-1888-42c2-a8ff-1de7bc4f9219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41ab28a1-92')
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.662 2 INFO nova.virt.libvirt.driver [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Deleting instance files /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1_del
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.663 2 INFO nova.virt.libvirt.driver [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Deletion of /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1_del complete
Oct 08 19:06:58 compute-0 systemd[1]: libpod-2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9.scope: Deactivated successfully.
Oct 08 19:06:58 compute-0 podman[145384]: 2025-10-08 19:06:58.66863911 +0000 UTC m=+0.056892664 container died 2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 08 19:06:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9-userdata-shm.mount: Deactivated successfully.
Oct 08 19:06:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-4f952153b9c075c928f3a4048d93d2e597dc86bce3878df25663d6f266130ce7-merged.mount: Deactivated successfully.
Oct 08 19:06:58 compute-0 podman[145384]: 2025-10-08 19:06:58.723165665 +0000 UTC m=+0.111419189 container cleanup 2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 08 19:06:58 compute-0 systemd[1]: libpod-conmon-2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9.scope: Deactivated successfully.
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.741 2 DEBUG nova.virt.libvirt.host [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.742 2 INFO nova.virt.libvirt.host [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] UEFI support detected
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.744 2 INFO nova.compute.manager [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Took 0.39 seconds to destroy the instance on the hypervisor.
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.745 2 DEBUG oslo.service.loopingcall [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.745 2 DEBUG nova.compute.manager [-] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.745 2 DEBUG nova.network.neutron [-] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 08 19:06:58 compute-0 podman[145431]: 2025-10-08 19:06:58.782675883 +0000 UTC m=+0.038022783 container remove 2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 08 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.787 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[6ba1b0a0-47ee-4474-812e-289f9269ad36]: (4, ('Wed Oct  8 07:06:58 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219 (2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9)\n2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9\nWed Oct  8 07:06:58 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219 (2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9)\n2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.789 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[d0361c33-fff9-4abc-b4b2-487efb4b6e3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.790 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f19211d-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:58 compute-0 kernel: tap3f19211d-10: left promiscuous mode
Oct 08 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.810 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[5ba43561-b88e-407c-89bb-5fd74bb0a814]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.841 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[a7364aa4-f0d8-4501-b04a-023d3c4abf4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.842 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[c6757f75-0dee-406c-96c4-9f805a4c184d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.858 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[01754535-b816-4d2a-9cab-aa716e37c858]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 106711, 'reachable_time': 15840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 145446, 'error': None, 'target': 'ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:58 compute-0 systemd[1]: run-netns-ovnmeta\x2d3f19211d\x2d1888\x2d42c2\x2da8ff\x2d1de7bc4f9219.mount: Deactivated successfully.
Oct 08 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.867 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 08 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.868 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[67f5ffe8-159b-48ea-8e2c-783ab259a6e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.247 2 DEBUG nova.compute.manager [req-02bd1e3e-7c75-4728-bd73-60dfc30a339b req-ea9de67c-e7c2-4e6f-8b06-031314e5d9da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Received event network-vif-unplugged-41ab28a1-9254-46a6-97fc-2220fe30eccd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.248 2 DEBUG oslo_concurrency.lockutils [req-02bd1e3e-7c75-4728-bd73-60dfc30a339b req-ea9de67c-e7c2-4e6f-8b06-031314e5d9da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.248 2 DEBUG oslo_concurrency.lockutils [req-02bd1e3e-7c75-4728-bd73-60dfc30a339b req-ea9de67c-e7c2-4e6f-8b06-031314e5d9da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.249 2 DEBUG oslo_concurrency.lockutils [req-02bd1e3e-7c75-4728-bd73-60dfc30a339b req-ea9de67c-e7c2-4e6f-8b06-031314e5d9da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.249 2 DEBUG nova.compute.manager [req-02bd1e3e-7c75-4728-bd73-60dfc30a339b req-ea9de67c-e7c2-4e6f-8b06-031314e5d9da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] No waiting events found dispatching network-vif-unplugged-41ab28a1-9254-46a6-97fc-2220fe30eccd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.250 2 DEBUG nova.compute.manager [req-02bd1e3e-7c75-4728-bd73-60dfc30a339b req-ea9de67c-e7c2-4e6f-8b06-031314e5d9da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Received event network-vif-unplugged-41ab28a1-9254-46a6-97fc-2220fe30eccd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 08 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.497 2 DEBUG nova.network.neutron [-] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.516 2 INFO nova.compute.manager [-] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Took 0.77 seconds to deallocate network for instance.
Oct 08 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.563 2 DEBUG oslo_concurrency.lockutils [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.564 2 DEBUG oslo_concurrency.lockutils [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.652 2 DEBUG nova.compute.provider_tree [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.669 2 DEBUG nova.scheduler.client.report [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.694 2 DEBUG oslo_concurrency.lockutils [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.735 2 INFO nova.scheduler.client.report [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 432f298f-78dd-4e9e-9ee4-279c2bc544c1
Oct 08 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.835 2 DEBUG oslo_concurrency.lockutils [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.488s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.923 2 DEBUG nova.compute.manager [req-1cc77f87-0150-409d-b44d-bce4b450fac0 req-035ce338-ced7-45e7-9a63-0ca5d5569dae bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Received event network-vif-deleted-41ab28a1-9254-46a6-97fc-2220fe30eccd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:07:00 compute-0 nova_compute[117514]: 2025-10-08 19:07:00.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:01 compute-0 nova_compute[117514]: 2025-10-08 19:07:01.322 2 DEBUG nova.compute.manager [req-57dea52f-ce0d-4477-bf92-4a92af692b04 req-da1b2d6e-4910-4d8c-b0fb-54585411dcd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Received event network-vif-plugged-41ab28a1-9254-46a6-97fc-2220fe30eccd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:07:01 compute-0 nova_compute[117514]: 2025-10-08 19:07:01.323 2 DEBUG oslo_concurrency.lockutils [req-57dea52f-ce0d-4477-bf92-4a92af692b04 req-da1b2d6e-4910-4d8c-b0fb-54585411dcd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:07:01 compute-0 nova_compute[117514]: 2025-10-08 19:07:01.323 2 DEBUG oslo_concurrency.lockutils [req-57dea52f-ce0d-4477-bf92-4a92af692b04 req-da1b2d6e-4910-4d8c-b0fb-54585411dcd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:07:01 compute-0 nova_compute[117514]: 2025-10-08 19:07:01.324 2 DEBUG oslo_concurrency.lockutils [req-57dea52f-ce0d-4477-bf92-4a92af692b04 req-da1b2d6e-4910-4d8c-b0fb-54585411dcd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:07:01 compute-0 nova_compute[117514]: 2025-10-08 19:07:01.324 2 DEBUG nova.compute.manager [req-57dea52f-ce0d-4477-bf92-4a92af692b04 req-da1b2d6e-4910-4d8c-b0fb-54585411dcd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] No waiting events found dispatching network-vif-plugged-41ab28a1-9254-46a6-97fc-2220fe30eccd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:07:01 compute-0 nova_compute[117514]: 2025-10-08 19:07:01.325 2 WARNING nova.compute.manager [req-57dea52f-ce0d-4477-bf92-4a92af692b04 req-da1b2d6e-4910-4d8c-b0fb-54585411dcd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Received unexpected event network-vif-plugged-41ab28a1-9254-46a6-97fc-2220fe30eccd for instance with vm_state deleted and task_state None.
Oct 08 19:07:02 compute-0 ovn_controller[19759]: 2025-10-08T19:07:02Z|00042|binding|INFO|Releasing lport f9878aab-28ef-456a-a43a-7cacc2381b1f from this chassis (sb_readonly=0)
Oct 08 19:07:02 compute-0 nova_compute[117514]: 2025-10-08 19:07:02.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.373 2 DEBUG oslo_concurrency.lockutils [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "533c431a-8ae8-4310-81dc-29285b78f93c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.373 2 DEBUG oslo_concurrency.lockutils [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.374 2 DEBUG oslo_concurrency.lockutils [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.374 2 DEBUG oslo_concurrency.lockutils [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.375 2 DEBUG oslo_concurrency.lockutils [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.377 2 INFO nova.compute.manager [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Terminating instance
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.379 2 DEBUG nova.compute.manager [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 08 19:07:03 compute-0 kernel: tap82f4743a-dc (unregistering): left promiscuous mode
Oct 08 19:07:03 compute-0 NetworkManager[1035]: <info>  [1759950423.4083] device (tap82f4743a-dc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:03 compute-0 ovn_controller[19759]: 2025-10-08T19:07:03Z|00043|binding|INFO|Releasing lport 82f4743a-dcdc-49f7-be61-94d565e29842 from this chassis (sb_readonly=0)
Oct 08 19:07:03 compute-0 ovn_controller[19759]: 2025-10-08T19:07:03Z|00044|binding|INFO|Setting lport 82f4743a-dcdc-49f7-be61-94d565e29842 down in Southbound
Oct 08 19:07:03 compute-0 ovn_controller[19759]: 2025-10-08T19:07:03Z|00045|binding|INFO|Removing iface tap82f4743a-dc ovn-installed in OVS
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.431 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:6b:6c 10.100.0.3'], port_security=['fa:16:3e:2e:6b:6c 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a913b285-6d0a-478e-aa24-18bb458d8f7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd3706646-002b-4286-ab41-a86fd84e3356', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30f96b84-f723-4541-a1ae-463e873ff4a9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=82f4743a-dcdc-49f7-be61-94d565e29842) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.434 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 82f4743a-dcdc-49f7-be61-94d565e29842 in datapath a913b285-6d0a-478e-aa24-18bb458d8f7a unbound from our chassis
Oct 08 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.436 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a913b285-6d0a-478e-aa24-18bb458d8f7a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 08 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.437 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e530d9ac-2335-4bfe-bf2a-d1076203377e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.438 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a namespace which is not needed anymore
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.449 2 DEBUG nova.compute.manager [req-70979864-b68c-4cf6-a572-e745794d451f req-acb49dd7-77f0-4780-b68c-54f5ba7035f8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received event network-changed-82f4743a-dcdc-49f7-be61-94d565e29842 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.450 2 DEBUG nova.compute.manager [req-70979864-b68c-4cf6-a572-e745794d451f req-acb49dd7-77f0-4780-b68c-54f5ba7035f8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Refreshing instance network info cache due to event network-changed-82f4743a-dcdc-49f7-be61-94d565e29842. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.451 2 DEBUG oslo_concurrency.lockutils [req-70979864-b68c-4cf6-a572-e745794d451f req-acb49dd7-77f0-4780-b68c-54f5ba7035f8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.452 2 DEBUG oslo_concurrency.lockutils [req-70979864-b68c-4cf6-a572-e745794d451f req-acb49dd7-77f0-4780-b68c-54f5ba7035f8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.452 2 DEBUG nova.network.neutron [req-70979864-b68c-4cf6-a572-e745794d451f req-acb49dd7-77f0-4780-b68c-54f5ba7035f8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Refreshing network info cache for port 82f4743a-dcdc-49f7-be61-94d565e29842 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:03 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Oct 08 19:07:03 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 15.272s CPU time.
Oct 08 19:07:03 compute-0 systemd-machined[77568]: Machine qemu-1-instance-00000001 terminated.
Oct 08 19:07:03 compute-0 neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a[144818]: [NOTICE]   (144822) : haproxy version is 2.8.14-c23fe91
Oct 08 19:07:03 compute-0 neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a[144818]: [NOTICE]   (144822) : path to executable is /usr/sbin/haproxy
Oct 08 19:07:03 compute-0 neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a[144818]: [WARNING]  (144822) : Exiting Master process...
Oct 08 19:07:03 compute-0 neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a[144818]: [WARNING]  (144822) : Exiting Master process...
Oct 08 19:07:03 compute-0 neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a[144818]: [ALERT]    (144822) : Current worker (144824) exited with code 143 (Terminated)
Oct 08 19:07:03 compute-0 neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a[144818]: [WARNING]  (144822) : All workers exited. Exiting... (0)
Oct 08 19:07:03 compute-0 systemd[1]: libpod-3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa.scope: Deactivated successfully.
Oct 08 19:07:03 compute-0 podman[145472]: 2025-10-08 19:07:03.641627421 +0000 UTC m=+0.072023828 container died 3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS)
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.661 2 INFO nova.virt.libvirt.driver [-] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Instance destroyed successfully.
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.661 2 DEBUG nova.objects.instance [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 533c431a-8ae8-4310-81dc-29285b78f93c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:07:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa-userdata-shm.mount: Deactivated successfully.
Oct 08 19:07:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-d18ad23441d9b789fc1d272bc7da5f2175a2c9c00ba472ac96b05443d421d00c-merged.mount: Deactivated successfully.
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.683 2 DEBUG nova.virt.libvirt.vif [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-447228763',display_name='tempest-TestNetworkBasicOps-server-447228763',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-447228763',id=1,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEkUPXM3K1FQRSOHUI4ceK1l6cbpFonPXFALKMkZcGgnSoRiUTQsb/Q287ApBX2G3xb2VwfVQAcm0rggAGmL4bEoFJTCQrQCAGh+fp9j7aUYBxWFzZf4Ok3jDCvBVuh0yA==',key_name='tempest-TestNetworkBasicOps-1885837558',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:05:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-2r2x09q7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:05:59Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=533c431a-8ae8-4310-81dc-29285b78f93c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.684 2 DEBUG nova.network.os_vif_util [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.685 2 DEBUG nova.network.os_vif_util [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2e:6b:6c,bridge_name='br-int',has_traffic_filtering=True,id=82f4743a-dcdc-49f7-be61-94d565e29842,network=Network(a913b285-6d0a-478e-aa24-18bb458d8f7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82f4743a-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.686 2 DEBUG os_vif [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2e:6b:6c,bridge_name='br-int',has_traffic_filtering=True,id=82f4743a-dcdc-49f7-be61-94d565e29842,network=Network(a913b285-6d0a-478e-aa24-18bb458d8f7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82f4743a-dc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.689 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82f4743a-dc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.698 2 INFO os_vif [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2e:6b:6c,bridge_name='br-int',has_traffic_filtering=True,id=82f4743a-dcdc-49f7-be61-94d565e29842,network=Network(a913b285-6d0a-478e-aa24-18bb458d8f7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82f4743a-dc')
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.700 2 INFO nova.virt.libvirt.driver [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Deleting instance files /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c_del
Oct 08 19:07:03 compute-0 podman[145472]: 2025-10-08 19:07:03.701084267 +0000 UTC m=+0.131480624 container cleanup 3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.701 2 INFO nova.virt.libvirt.driver [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Deletion of /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c_del complete
Oct 08 19:07:03 compute-0 systemd[1]: libpod-conmon-3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa.scope: Deactivated successfully.
Oct 08 19:07:03 compute-0 podman[145519]: 2025-10-08 19:07:03.777433078 +0000 UTC m=+0.050079938 container remove 3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 08 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.786 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e0be4785-02d3-4ffe-8626-cb143600038c]: (4, ('Wed Oct  8 07:07:03 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a (3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa)\n3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa\nWed Oct  8 07:07:03 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a (3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa)\n3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.788 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e1db45f3-209c-4d26-8575-3344ddeb145c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.790 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa913b285-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:07:03 compute-0 kernel: tapa913b285-60: left promiscuous mode
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.799 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[46bf249b-3287-473b-8361-30aafa17f97e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.847 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[2aaf48f8-2f4d-40b5-9054-976348a76769]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.849 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[578be7e4-7d3a-4aac-82a6-22a4ab2c81f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.850 2 INFO nova.compute.manager [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Took 0.47 seconds to destroy the instance on the hypervisor.
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.851 2 DEBUG oslo.service.loopingcall [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.853 2 DEBUG nova.compute.manager [-] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 08 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.853 2 DEBUG nova.network.neutron [-] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 08 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.872 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[36c6e66d-58de-4837-a539-6b1db93c5565]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 103214, 'reachable_time': 24386, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 145535, 'error': None, 'target': 'ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.874 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 08 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.874 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[d822941a-7283-46f1-b916-71bee9e53e34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:03 compute-0 systemd[1]: run-netns-ovnmeta\x2da913b285\x2d6d0a\x2d478e\x2daa24\x2d18bb458d8f7a.mount: Deactivated successfully.
Oct 08 19:07:04 compute-0 nova_compute[117514]: 2025-10-08 19:07:04.358 2 DEBUG nova.network.neutron [-] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:07:04 compute-0 nova_compute[117514]: 2025-10-08 19:07:04.380 2 INFO nova.compute.manager [-] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Took 0.53 seconds to deallocate network for instance.
Oct 08 19:07:04 compute-0 nova_compute[117514]: 2025-10-08 19:07:04.439 2 DEBUG oslo_concurrency.lockutils [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:07:04 compute-0 nova_compute[117514]: 2025-10-08 19:07:04.440 2 DEBUG oslo_concurrency.lockutils [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:07:04 compute-0 nova_compute[117514]: 2025-10-08 19:07:04.513 2 DEBUG nova.compute.provider_tree [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:07:04 compute-0 nova_compute[117514]: 2025-10-08 19:07:04.529 2 DEBUG nova.scheduler.client.report [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:07:04 compute-0 nova_compute[117514]: 2025-10-08 19:07:04.548 2 DEBUG oslo_concurrency.lockutils [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:07:04 compute-0 nova_compute[117514]: 2025-10-08 19:07:04.575 2 INFO nova.scheduler.client.report [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 533c431a-8ae8-4310-81dc-29285b78f93c
Oct 08 19:07:04 compute-0 nova_compute[117514]: 2025-10-08 19:07:04.651 2 DEBUG oslo_concurrency.lockutils [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.060 2 DEBUG nova.network.neutron [req-70979864-b68c-4cf6-a572-e745794d451f req-acb49dd7-77f0-4780-b68c-54f5ba7035f8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updated VIF entry in instance network info cache for port 82f4743a-dcdc-49f7-be61-94d565e29842. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.061 2 DEBUG nova.network.neutron [req-70979864-b68c-4cf6-a572-e745794d451f req-acb49dd7-77f0-4780-b68c-54f5ba7035f8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updating instance_info_cache with network_info: [{"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.079 2 DEBUG oslo_concurrency.lockutils [req-70979864-b68c-4cf6-a572-e745794d451f req-acb49dd7-77f0-4780-b68c-54f5ba7035f8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.574 2 DEBUG nova.compute.manager [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received event network-vif-unplugged-82f4743a-dcdc-49f7-be61-94d565e29842 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.575 2 DEBUG oslo_concurrency.lockutils [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.575 2 DEBUG oslo_concurrency.lockutils [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.576 2 DEBUG oslo_concurrency.lockutils [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.576 2 DEBUG nova.compute.manager [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] No waiting events found dispatching network-vif-unplugged-82f4743a-dcdc-49f7-be61-94d565e29842 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.577 2 WARNING nova.compute.manager [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received unexpected event network-vif-unplugged-82f4743a-dcdc-49f7-be61-94d565e29842 for instance with vm_state deleted and task_state None.
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.577 2 DEBUG nova.compute.manager [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received event network-vif-plugged-82f4743a-dcdc-49f7-be61-94d565e29842 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.578 2 DEBUG oslo_concurrency.lockutils [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.579 2 DEBUG oslo_concurrency.lockutils [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.579 2 DEBUG oslo_concurrency.lockutils [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.580 2 DEBUG nova.compute.manager [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] No waiting events found dispatching network-vif-plugged-82f4743a-dcdc-49f7-be61-94d565e29842 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.580 2 WARNING nova.compute.manager [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received unexpected event network-vif-plugged-82f4743a-dcdc-49f7-be61-94d565e29842 for instance with vm_state deleted and task_state None.
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.581 2 DEBUG nova.compute.manager [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received event network-vif-deleted-82f4743a-dcdc-49f7-be61-94d565e29842 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.958 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.959 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.974 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.974 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.974 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.987 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.987 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.987 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.987 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.988 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.988 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.988 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.989 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.008 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.008 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.008 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.009 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:07:06 compute-0 podman[145537]: 2025-10-08 19:07:06.149233853 +0000 UTC m=+0.083483947 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 08 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.212 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.213 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6103MB free_disk=73.42378616333008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.213 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.213 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.285 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.285 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.322 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.335 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.357 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.358 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:07:07 compute-0 nova_compute[117514]: 2025-10-08 19:07:07.088 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:07:07 compute-0 nova_compute[117514]: 2025-10-08 19:07:07.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:08 compute-0 nova_compute[117514]: 2025-10-08 19:07:08.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:08 compute-0 nova_compute[117514]: 2025-10-08 19:07:08.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:10 compute-0 nova_compute[117514]: 2025-10-08 19:07:10.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:13 compute-0 nova_compute[117514]: 2025-10-08 19:07:13.626 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950418.625193, 432f298f-78dd-4e9e-9ee4-279c2bc544c1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:07:13 compute-0 nova_compute[117514]: 2025-10-08 19:07:13.627 2 INFO nova.compute.manager [-] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] VM Stopped (Lifecycle Event)
Oct 08 19:07:13 compute-0 nova_compute[117514]: 2025-10-08 19:07:13.652 2 DEBUG nova.compute.manager [None req-eebc01ce-a890-4c54-96c8-1687ed263f54 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:07:13 compute-0 nova_compute[117514]: 2025-10-08 19:07:13.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:15 compute-0 nova_compute[117514]: 2025-10-08 19:07:15.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:15 compute-0 podman[145562]: 2025-10-08 19:07:15.667541694 +0000 UTC m=+0.092552507 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm)
Oct 08 19:07:18 compute-0 nova_compute[117514]: 2025-10-08 19:07:18.657 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950423.6562872, 533c431a-8ae8-4310-81dc-29285b78f93c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:07:18 compute-0 nova_compute[117514]: 2025-10-08 19:07:18.657 2 INFO nova.compute.manager [-] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] VM Stopped (Lifecycle Event)
Oct 08 19:07:18 compute-0 nova_compute[117514]: 2025-10-08 19:07:18.676 2 DEBUG nova.compute.manager [None req-a69db70a-3252-46f5-84dd-d499a8296a62 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:07:18 compute-0 nova_compute[117514]: 2025-10-08 19:07:18.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:20 compute-0 nova_compute[117514]: 2025-10-08 19:07:20.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:21 compute-0 podman[145582]: 2025-10-08 19:07:21.65709319 +0000 UTC m=+0.072425360 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=edpm, name=ubi9-minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 08 19:07:21 compute-0 podman[145583]: 2025-10-08 19:07:21.660958201 +0000 UTC m=+0.071259117 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.357 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.358 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.382 2 DEBUG nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.498 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.498 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.508 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.508 2 INFO nova.compute.claims [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Claim successful on node compute-0.ctlplane.example.com
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.629 2 DEBUG nova.compute.provider_tree [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.647 2 DEBUG nova.scheduler.client.report [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.672 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.673 2 DEBUG nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.713 2 DEBUG nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.714 2 DEBUG nova.network.neutron [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.731 2 INFO nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.750 2 DEBUG nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.829 2 DEBUG nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.832 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.833 2 INFO nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Creating image(s)
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.834 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.834 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.836 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.857 2 DEBUG nova.policy [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.859 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.922 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.924 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.925 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.951 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.010 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.012 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.177 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk 1073741824" returned: 0 in 0.166s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.179 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.180 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.252 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.253 2 DEBUG nova.virt.disk.api [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 08 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.254 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.309 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.310 2 DEBUG nova.virt.disk.api [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 08 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.311 2 DEBUG nova.objects.instance [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid b66b330b-1cad-4dfb-a2f9-83201dc8ee32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.325 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 08 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.325 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Ensure instance console log exists: /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 08 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.326 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.326 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.327 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:25 compute-0 nova_compute[117514]: 2025-10-08 19:07:25.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:25 compute-0 podman[145638]: 2025-10-08 19:07:25.652107867 +0000 UTC m=+0.074556170 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 08 19:07:26 compute-0 nova_compute[117514]: 2025-10-08 19:07:26.182 2 DEBUG nova.network.neutron [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Successfully created port: 0107be0e-1b4b-47dd-9422-a435ded0964c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 08 19:07:27 compute-0 podman[145664]: 2025-10-08 19:07:27.65660042 +0000 UTC m=+0.059593041 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent)
Oct 08 19:07:27 compute-0 podman[145663]: 2025-10-08 19:07:27.697953387 +0000 UTC m=+0.076627580 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=iscsid, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct 08 19:07:28 compute-0 nova_compute[117514]: 2025-10-08 19:07:28.021 2 DEBUG nova.network.neutron [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Successfully updated port: 0107be0e-1b4b-47dd-9422-a435ded0964c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 08 19:07:28 compute-0 nova_compute[117514]: 2025-10-08 19:07:28.039 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:07:28 compute-0 nova_compute[117514]: 2025-10-08 19:07:28.039 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:07:28 compute-0 nova_compute[117514]: 2025-10-08 19:07:28.039 2 DEBUG nova.network.neutron [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 08 19:07:28 compute-0 nova_compute[117514]: 2025-10-08 19:07:28.120 2 DEBUG nova.compute.manager [req-d576261a-f88c-4a06-b12e-1473a2dddcaa req-d8d75571-6be0-49a7-afb8-bcf1c167c4d7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-changed-0107be0e-1b4b-47dd-9422-a435ded0964c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:07:28 compute-0 nova_compute[117514]: 2025-10-08 19:07:28.121 2 DEBUG nova.compute.manager [req-d576261a-f88c-4a06-b12e-1473a2dddcaa req-d8d75571-6be0-49a7-afb8-bcf1c167c4d7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Refreshing instance network info cache due to event network-changed-0107be0e-1b4b-47dd-9422-a435ded0964c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:07:28 compute-0 nova_compute[117514]: 2025-10-08 19:07:28.121 2 DEBUG oslo_concurrency.lockutils [req-d576261a-f88c-4a06-b12e-1473a2dddcaa req-d8d75571-6be0-49a7-afb8-bcf1c167c4d7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:07:28 compute-0 nova_compute[117514]: 2025-10-08 19:07:28.196 2 DEBUG nova.network.neutron [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 08 19:07:28 compute-0 nova_compute[117514]: 2025-10-08 19:07:28.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.265 2 DEBUG nova.network.neutron [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updating instance_info_cache with network_info: [{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.290 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.291 2 DEBUG nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Instance network_info: |[{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.291 2 DEBUG oslo_concurrency.lockutils [req-d576261a-f88c-4a06-b12e-1473a2dddcaa req-d8d75571-6be0-49a7-afb8-bcf1c167c4d7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.291 2 DEBUG nova.network.neutron [req-d576261a-f88c-4a06-b12e-1473a2dddcaa req-d8d75571-6be0-49a7-afb8-bcf1c167c4d7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Refreshing network info cache for port 0107be0e-1b4b-47dd-9422-a435ded0964c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.295 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Start _get_guest_xml network_info=[{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.299 2 WARNING nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.304 2 DEBUG nova.virt.libvirt.host [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.305 2 DEBUG nova.virt.libvirt.host [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.311 2 DEBUG nova.virt.libvirt.host [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.312 2 DEBUG nova.virt.libvirt.host [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.312 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.312 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.313 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.313 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.314 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.314 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.314 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.315 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.315 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.315 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.315 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.316 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.319 2 DEBUG nova.virt.libvirt.vif [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:07:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-602516393',display_name='tempest-TestNetworkBasicOps-server-602516393',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-602516393',id=3,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1aTvBPkgf1VfNjvC4uuKKg+tISnXImijmvs2cGp+FtgeRrvxYh9lBxLRU9xSzH0Z6LaCabBaf6NwgK+eU8uEwumcvsX4qsd2EcbV6VjIknh+8LBbcMTdQeQSSFJx6qhQ==',key_name='tempest-TestNetworkBasicOps-1029193278',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-dwebwbaf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:07:22Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b66b330b-1cad-4dfb-a2f9-83201dc8ee32,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.320 2 DEBUG nova.network.os_vif_util [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.321 2 DEBUG nova.network.os_vif_util [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:63:9d,bridge_name='br-int',has_traffic_filtering=True,id=0107be0e-1b4b-47dd-9422-a435ded0964c,network=Network(15690acb-54cf-4081-a718-c14a1c0af6a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0107be0e-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.322 2 DEBUG nova.objects.instance [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid b66b330b-1cad-4dfb-a2f9-83201dc8ee32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.335 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] End _get_guest_xml xml=<domain type="kvm">
Oct 08 19:07:29 compute-0 nova_compute[117514]:   <uuid>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</uuid>
Oct 08 19:07:29 compute-0 nova_compute[117514]:   <name>instance-00000003</name>
Oct 08 19:07:29 compute-0 nova_compute[117514]:   <memory>131072</memory>
Oct 08 19:07:29 compute-0 nova_compute[117514]:   <vcpu>1</vcpu>
Oct 08 19:07:29 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <nova:name>tempest-TestNetworkBasicOps-server-602516393</nova:name>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <nova:creationTime>2025-10-08 19:07:29</nova:creationTime>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <nova:flavor name="m1.nano">
Oct 08 19:07:29 compute-0 nova_compute[117514]:         <nova:memory>128</nova:memory>
Oct 08 19:07:29 compute-0 nova_compute[117514]:         <nova:disk>1</nova:disk>
Oct 08 19:07:29 compute-0 nova_compute[117514]:         <nova:swap>0</nova:swap>
Oct 08 19:07:29 compute-0 nova_compute[117514]:         <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:07:29 compute-0 nova_compute[117514]:         <nova:vcpus>1</nova:vcpus>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       </nova:flavor>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <nova:owner>
Oct 08 19:07:29 compute-0 nova_compute[117514]:         <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:07:29 compute-0 nova_compute[117514]:         <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       </nova:owner>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <nova:ports>
Oct 08 19:07:29 compute-0 nova_compute[117514]:         <nova:port uuid="0107be0e-1b4b-47dd-9422-a435ded0964c">
Oct 08 19:07:29 compute-0 nova_compute[117514]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:         </nova:port>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       </nova:ports>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     </nova:instance>
Oct 08 19:07:29 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:07:29 compute-0 nova_compute[117514]:   <sysinfo type="smbios">
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <system>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <entry name="manufacturer">RDO</entry>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <entry name="product">OpenStack Compute</entry>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <entry name="serial">b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <entry name="uuid">b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <entry name="family">Virtual Machine</entry>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     </system>
Oct 08 19:07:29 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:07:29 compute-0 nova_compute[117514]:   <os>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <boot dev="hd"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <smbios mode="sysinfo"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:   </os>
Oct 08 19:07:29 compute-0 nova_compute[117514]:   <features>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <vmcoreinfo/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:   </features>
Oct 08 19:07:29 compute-0 nova_compute[117514]:   <clock offset="utc">
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <timer name="pit" tickpolicy="delay"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <timer name="hpet" present="no"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:07:29 compute-0 nova_compute[117514]:   <cpu mode="host-model" match="exact">
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <topology sockets="1" cores="1" threads="1"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:07:29 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <disk type="file" device="disk">
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <target dev="vda" bus="virtio"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <disk type="file" device="cdrom">
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <driver name="qemu" type="raw" cache="none"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.config"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <target dev="sda" bus="sata"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <interface type="ethernet">
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <mac address="fa:16:3e:d7:63:9d"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <mtu size="1442"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <target dev="tap0107be0e-1b"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <serial type="pty">
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <log file="/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log" append="off"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <video>
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     </video>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <input type="tablet" bus="usb"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <rng model="virtio">
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <backend model="random">/dev/urandom</backend>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <controller type="usb" index="0"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     <memballoon model="virtio">
Oct 08 19:07:29 compute-0 nova_compute[117514]:       <stats period="10"/>
Oct 08 19:07:29 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:07:29 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:07:29 compute-0 nova_compute[117514]: </domain>
Oct 08 19:07:29 compute-0 nova_compute[117514]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.336 2 DEBUG nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Preparing to wait for external event network-vif-plugged-0107be0e-1b4b-47dd-9422-a435ded0964c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.337 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.337 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.337 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.338 2 DEBUG nova.virt.libvirt.vif [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:07:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-602516393',display_name='tempest-TestNetworkBasicOps-server-602516393',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-602516393',id=3,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1aTvBPkgf1VfNjvC4uuKKg+tISnXImijmvs2cGp+FtgeRrvxYh9lBxLRU9xSzH0Z6LaCabBaf6NwgK+eU8uEwumcvsX4qsd2EcbV6VjIknh+8LBbcMTdQeQSSFJx6qhQ==',key_name='tempest-TestNetworkBasicOps-1029193278',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-dwebwbaf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:07:22Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b66b330b-1cad-4dfb-a2f9-83201dc8ee32,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.339 2 DEBUG nova.network.os_vif_util [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.339 2 DEBUG nova.network.os_vif_util [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:63:9d,bridge_name='br-int',has_traffic_filtering=True,id=0107be0e-1b4b-47dd-9422-a435ded0964c,network=Network(15690acb-54cf-4081-a718-c14a1c0af6a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0107be0e-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.340 2 DEBUG os_vif [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:63:9d,bridge_name='br-int',has_traffic_filtering=True,id=0107be0e-1b4b-47dd-9422-a435ded0964c,network=Network(15690acb-54cf-4081-a718-c14a1c0af6a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0107be0e-1b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.341 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.341 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.344 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0107be0e-1b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.344 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0107be0e-1b, col_values=(('external_ids', {'iface-id': '0107be0e-1b4b-47dd-9422-a435ded0964c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d7:63:9d', 'vm-uuid': 'b66b330b-1cad-4dfb-a2f9-83201dc8ee32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:29 compute-0 NetworkManager[1035]: <info>  [1759950449.3480] manager: (tap0107be0e-1b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.353 2 INFO os_vif [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:63:9d,bridge_name='br-int',has_traffic_filtering=True,id=0107be0e-1b4b-47dd-9422-a435ded0964c,network=Network(15690acb-54cf-4081-a718-c14a1c0af6a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0107be0e-1b')
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.485 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.486 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.487 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:d7:63:9d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 08 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.488 2 INFO nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Using config drive
Oct 08 19:07:29 compute-0 podman[145706]: 2025-10-08 19:07:29.704552358 +0000 UTC m=+0.119535571 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:07:30 compute-0 nova_compute[117514]: 2025-10-08 19:07:30.167 2 INFO nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Creating config drive at /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.config
Oct 08 19:07:30 compute-0 nova_compute[117514]: 2025-10-08 19:07:30.176 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa7w4x7v3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:07:30 compute-0 nova_compute[117514]: 2025-10-08 19:07:30.318 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa7w4x7v3" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:07:30 compute-0 kernel: tap0107be0e-1b: entered promiscuous mode
Oct 08 19:07:30 compute-0 ovn_controller[19759]: 2025-10-08T19:07:30Z|00046|binding|INFO|Claiming lport 0107be0e-1b4b-47dd-9422-a435ded0964c for this chassis.
Oct 08 19:07:30 compute-0 ovn_controller[19759]: 2025-10-08T19:07:30Z|00047|binding|INFO|0107be0e-1b4b-47dd-9422-a435ded0964c: Claiming fa:16:3e:d7:63:9d 10.100.0.6
Oct 08 19:07:30 compute-0 NetworkManager[1035]: <info>  [1759950450.3994] manager: (tap0107be0e-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Oct 08 19:07:30 compute-0 nova_compute[117514]: 2025-10-08 19:07:30.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:30 compute-0 nova_compute[117514]: 2025-10-08 19:07:30.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.421 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:63:9d 10.100.0.6'], port_security=['fa:16:3e:d7:63:9d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'b66b330b-1cad-4dfb-a2f9-83201dc8ee32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-15690acb-54cf-4081-a718-c14a1c0af6a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '18c7314c-d74a-4643-933f-4dc6b05c33cc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9980b68-53e4-4dfd-a3d6-cbcaebcf011d, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=0107be0e-1b4b-47dd-9422-a435ded0964c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.424 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 0107be0e-1b4b-47dd-9422-a435ded0964c in datapath 15690acb-54cf-4081-a718-c14a1c0af6a8 bound to our chassis
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.426 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 15690acb-54cf-4081-a718-c14a1c0af6a8
Oct 08 19:07:30 compute-0 systemd-udevd[145750]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.445 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[524d61a5-11f7-4b43-91be-b9c720ff22cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.446 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap15690acb-51 in ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.448 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap15690acb-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.448 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[3c63d004-7be7-4ce9-a57c-b3408a58dcef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.449 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[db5da360-dd79-4076-91af-ad769c2861f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:30 compute-0 NetworkManager[1035]: <info>  [1759950450.4557] device (tap0107be0e-1b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 08 19:07:30 compute-0 NetworkManager[1035]: <info>  [1759950450.4571] device (tap0107be0e-1b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 08 19:07:30 compute-0 systemd-machined[77568]: New machine qemu-3-instance-00000003.
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.462 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[35d09d2a-17b6-46d5-8ede-bf0eff73ec88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:30 compute-0 nova_compute[117514]: 2025-10-08 19:07:30.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:30 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Oct 08 19:07:30 compute-0 ovn_controller[19759]: 2025-10-08T19:07:30Z|00048|binding|INFO|Setting lport 0107be0e-1b4b-47dd-9422-a435ded0964c ovn-installed in OVS
Oct 08 19:07:30 compute-0 ovn_controller[19759]: 2025-10-08T19:07:30Z|00049|binding|INFO|Setting lport 0107be0e-1b4b-47dd-9422-a435ded0964c up in Southbound
Oct 08 19:07:30 compute-0 nova_compute[117514]: 2025-10-08 19:07:30.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.503 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[826fb329-891f-437a-96a0-190932833abb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.538 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[fb4fa0e4-970c-4bd7-ab2e-704723b60344]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:30 compute-0 nova_compute[117514]: 2025-10-08 19:07:30.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.543 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[0b6056d3-ccb6-4c2a-8333-81644c7988f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:30 compute-0 NetworkManager[1035]: <info>  [1759950450.5452] manager: (tap15690acb-50): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.577 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[5dddfa28-4455-4ba0-be34-78e46748ac82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.579 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[c87e521e-3655-46db-a688-bc39a846d476]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:30 compute-0 NetworkManager[1035]: <info>  [1759950450.6018] device (tap15690acb-50): carrier: link connected
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.607 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[02d0c148-3d15-46a2-9950-9f0ebbc6beb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.629 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e29dc5c4-50e9-4ac9-b64a-051e2b9aace8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap15690acb-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:13:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 112118, 'reachable_time': 42695, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 145783, 'error': None, 'target': 'ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.653 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[4637b344-7ac6-415c-b405-0ad259d1a24a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb8:1395'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 112118, 'tstamp': 112118}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 145784, 'error': None, 'target': 'ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.674 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ae09ee8d-c651-43e1-84d9-cc23fcda1ef4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap15690acb-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:13:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 112118, 'reachable_time': 42695, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 145785, 'error': None, 'target': 'ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.720 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[04c0eec0-442a-4fa7-b75f-61cf47c677f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.799 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[86b18965-93ef-4d98-8ad2-76c55f0e45e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.800 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15690acb-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.800 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.800 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap15690acb-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:07:30 compute-0 kernel: tap15690acb-50: entered promiscuous mode
Oct 08 19:07:30 compute-0 NetworkManager[1035]: <info>  [1759950450.8032] manager: (tap15690acb-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.805 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap15690acb-50, col_values=(('external_ids', {'iface-id': 'b2172a75-691e-43ff-a242-3b06a5bfd197'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:07:30 compute-0 ovn_controller[19759]: 2025-10-08T19:07:30Z|00050|binding|INFO|Releasing lport b2172a75-691e-43ff-a242-3b06a5bfd197 from this chassis (sb_readonly=0)
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.808 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/15690acb-54cf-4081-a718-c14a1c0af6a8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/15690acb-54cf-4081-a718-c14a1c0af6a8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.809 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[cb8caff9-c90d-430f-8d89-06be28316676]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.809 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: global
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     log         /dev/log local0 debug
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     log-tag     haproxy-metadata-proxy-15690acb-54cf-4081-a718-c14a1c0af6a8
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     user        root
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     group       root
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     maxconn     1024
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     pidfile     /var/lib/neutron/external/pids/15690acb-54cf-4081-a718-c14a1c0af6a8.pid.haproxy
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     daemon
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: defaults
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     log global
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     mode http
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     option httplog
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     option dontlognull
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     option http-server-close
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     option forwardfor
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     retries                 3
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     timeout http-request    30s
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     timeout connect         30s
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     timeout client          32s
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     timeout server          32s
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     timeout http-keep-alive 30s
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: listen listener
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     bind 169.254.169.254:80
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     server metadata /var/lib/neutron/metadata_proxy
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:     http-request add-header X-OVN-Network-ID 15690acb-54cf-4081-a718-c14a1c0af6a8
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 08 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.810 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8', 'env', 'PROCESS_TAG=haproxy-15690acb-54cf-4081-a718-c14a1c0af6a8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/15690acb-54cf-4081-a718-c14a1c0af6a8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 08 19:07:30 compute-0 nova_compute[117514]: 2025-10-08 19:07:30.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.092 2 DEBUG nova.network.neutron [req-d576261a-f88c-4a06-b12e-1473a2dddcaa req-d8d75571-6be0-49a7-afb8-bcf1c167c4d7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updated VIF entry in instance network info cache for port 0107be0e-1b4b-47dd-9422-a435ded0964c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.093 2 DEBUG nova.network.neutron [req-d576261a-f88c-4a06-b12e-1473a2dddcaa req-d8d75571-6be0-49a7-afb8-bcf1c167c4d7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updating instance_info_cache with network_info: [{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.128 2 DEBUG oslo_concurrency.lockutils [req-d576261a-f88c-4a06-b12e-1473a2dddcaa req-d8d75571-6be0-49a7-afb8-bcf1c167c4d7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.172 2 DEBUG nova.compute.manager [req-b259494c-4afb-4e90-8111-d49581857088 req-d2a7e073-7015-48a1-bb79-ebbfec6443c4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-plugged-0107be0e-1b4b-47dd-9422-a435ded0964c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.173 2 DEBUG oslo_concurrency.lockutils [req-b259494c-4afb-4e90-8111-d49581857088 req-d2a7e073-7015-48a1-bb79-ebbfec6443c4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.173 2 DEBUG oslo_concurrency.lockutils [req-b259494c-4afb-4e90-8111-d49581857088 req-d2a7e073-7015-48a1-bb79-ebbfec6443c4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.173 2 DEBUG oslo_concurrency.lockutils [req-b259494c-4afb-4e90-8111-d49581857088 req-d2a7e073-7015-48a1-bb79-ebbfec6443c4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.173 2 DEBUG nova.compute.manager [req-b259494c-4afb-4e90-8111-d49581857088 req-d2a7e073-7015-48a1-bb79-ebbfec6443c4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Processing event network-vif-plugged-0107be0e-1b4b-47dd-9422-a435ded0964c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 08 19:07:31 compute-0 podman[145817]: 2025-10-08 19:07:31.150146162 +0000 UTC m=+0.020985601 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 08 19:07:31 compute-0 podman[145817]: 2025-10-08 19:07:31.512272797 +0000 UTC m=+0.383112246 container create f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:07:31 compute-0 systemd[1]: Started libpod-conmon-f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c.scope.
Oct 08 19:07:31 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:07:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3dc6caed12c224c239b697cd06381493a291f904c3b4b3172b2f62f362bdce12/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 08 19:07:31 compute-0 podman[145817]: 2025-10-08 19:07:31.822428882 +0000 UTC m=+0.693268401 container init f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 08 19:07:31 compute-0 podman[145817]: 2025-10-08 19:07:31.831587197 +0000 UTC m=+0.702426656 container start f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 08 19:07:31 compute-0 neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8[145840]: [NOTICE]   (145844) : New worker (145846) forked
Oct 08 19:07:31 compute-0 neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8[145840]: [NOTICE]   (145844) : Loading success.
Oct 08 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.994 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950451.9940875, b66b330b-1cad-4dfb-a2f9-83201dc8ee32 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.995 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] VM Started (Lifecycle Event)
Oct 08 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.998 2 DEBUG nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.002 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.006 2 INFO nova.virt.libvirt.driver [-] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Instance spawned successfully.
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.008 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.016 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.021 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.038 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.038 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.039 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.039 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.040 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.040 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.052 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.053 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950451.9941866, b66b330b-1cad-4dfb-a2f9-83201dc8ee32 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.053 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] VM Paused (Lifecycle Event)
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.086 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.091 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950452.001077, b66b330b-1cad-4dfb-a2f9-83201dc8ee32 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.092 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] VM Resumed (Lifecycle Event)
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.119 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.125 2 INFO nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Took 9.29 seconds to spawn the instance on the hypervisor.
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.125 2 DEBUG nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.126 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.157 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.186 2 INFO nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Took 9.73 seconds to build instance.
Oct 08 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.198 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.840s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:07:33 compute-0 nova_compute[117514]: 2025-10-08 19:07:33.257 2 DEBUG nova.compute.manager [req-fa73bb40-f60d-4f07-94d6-d7ddc561fff0 req-c01fdfb7-8386-4b3c-b240-dd999f182d41 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-plugged-0107be0e-1b4b-47dd-9422-a435ded0964c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:07:33 compute-0 nova_compute[117514]: 2025-10-08 19:07:33.257 2 DEBUG oslo_concurrency.lockutils [req-fa73bb40-f60d-4f07-94d6-d7ddc561fff0 req-c01fdfb7-8386-4b3c-b240-dd999f182d41 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:07:33 compute-0 nova_compute[117514]: 2025-10-08 19:07:33.258 2 DEBUG oslo_concurrency.lockutils [req-fa73bb40-f60d-4f07-94d6-d7ddc561fff0 req-c01fdfb7-8386-4b3c-b240-dd999f182d41 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:07:33 compute-0 nova_compute[117514]: 2025-10-08 19:07:33.259 2 DEBUG oslo_concurrency.lockutils [req-fa73bb40-f60d-4f07-94d6-d7ddc561fff0 req-c01fdfb7-8386-4b3c-b240-dd999f182d41 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:07:33 compute-0 nova_compute[117514]: 2025-10-08 19:07:33.259 2 DEBUG nova.compute.manager [req-fa73bb40-f60d-4f07-94d6-d7ddc561fff0 req-c01fdfb7-8386-4b3c-b240-dd999f182d41 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] No waiting events found dispatching network-vif-plugged-0107be0e-1b4b-47dd-9422-a435ded0964c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:07:33 compute-0 nova_compute[117514]: 2025-10-08 19:07:33.260 2 WARNING nova.compute.manager [req-fa73bb40-f60d-4f07-94d6-d7ddc561fff0 req-c01fdfb7-8386-4b3c-b240-dd999f182d41 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received unexpected event network-vif-plugged-0107be0e-1b4b-47dd-9422-a435ded0964c for instance with vm_state active and task_state None.
Oct 08 19:07:34 compute-0 nova_compute[117514]: 2025-10-08 19:07:34.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:35 compute-0 nova_compute[117514]: 2025-10-08 19:07:35.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:36 compute-0 ovn_controller[19759]: 2025-10-08T19:07:36Z|00051|binding|INFO|Releasing lport b2172a75-691e-43ff-a242-3b06a5bfd197 from this chassis (sb_readonly=0)
Oct 08 19:07:36 compute-0 NetworkManager[1035]: <info>  [1759950456.4866] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Oct 08 19:07:36 compute-0 NetworkManager[1035]: <info>  [1759950456.4879] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Oct 08 19:07:36 compute-0 nova_compute[117514]: 2025-10-08 19:07:36.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:36 compute-0 ovn_controller[19759]: 2025-10-08T19:07:36Z|00052|binding|INFO|Releasing lport b2172a75-691e-43ff-a242-3b06a5bfd197 from this chassis (sb_readonly=0)
Oct 08 19:07:36 compute-0 nova_compute[117514]: 2025-10-08 19:07:36.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:36 compute-0 nova_compute[117514]: 2025-10-08 19:07:36.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:36 compute-0 podman[145857]: 2025-10-08 19:07:36.678310169 +0000 UTC m=+0.082985353 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 08 19:07:36 compute-0 nova_compute[117514]: 2025-10-08 19:07:36.796 2 DEBUG nova.compute.manager [req-d9c34c21-d660-420c-8c12-b9bd70bc0a64 req-50659bf2-2d61-487d-a504-f786c54340bc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-changed-0107be0e-1b4b-47dd-9422-a435ded0964c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:07:36 compute-0 nova_compute[117514]: 2025-10-08 19:07:36.797 2 DEBUG nova.compute.manager [req-d9c34c21-d660-420c-8c12-b9bd70bc0a64 req-50659bf2-2d61-487d-a504-f786c54340bc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Refreshing instance network info cache due to event network-changed-0107be0e-1b4b-47dd-9422-a435ded0964c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:07:36 compute-0 nova_compute[117514]: 2025-10-08 19:07:36.797 2 DEBUG oslo_concurrency.lockutils [req-d9c34c21-d660-420c-8c12-b9bd70bc0a64 req-50659bf2-2d61-487d-a504-f786c54340bc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:07:36 compute-0 nova_compute[117514]: 2025-10-08 19:07:36.798 2 DEBUG oslo_concurrency.lockutils [req-d9c34c21-d660-420c-8c12-b9bd70bc0a64 req-50659bf2-2d61-487d-a504-f786c54340bc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:07:36 compute-0 nova_compute[117514]: 2025-10-08 19:07:36.798 2 DEBUG nova.network.neutron [req-d9c34c21-d660-420c-8c12-b9bd70bc0a64 req-50659bf2-2d61-487d-a504-f786c54340bc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Refreshing network info cache for port 0107be0e-1b4b-47dd-9422-a435ded0964c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:07:38 compute-0 nova_compute[117514]: 2025-10-08 19:07:38.257 2 DEBUG nova.network.neutron [req-d9c34c21-d660-420c-8c12-b9bd70bc0a64 req-50659bf2-2d61-487d-a504-f786c54340bc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updated VIF entry in instance network info cache for port 0107be0e-1b4b-47dd-9422-a435ded0964c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:07:38 compute-0 nova_compute[117514]: 2025-10-08 19:07:38.258 2 DEBUG nova.network.neutron [req-d9c34c21-d660-420c-8c12-b9bd70bc0a64 req-50659bf2-2d61-487d-a504-f786c54340bc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updating instance_info_cache with network_info: [{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:07:38 compute-0 nova_compute[117514]: 2025-10-08 19:07:38.297 2 DEBUG oslo_concurrency.lockutils [req-d9c34c21-d660-420c-8c12-b9bd70bc0a64 req-50659bf2-2d61-487d-a504-f786c54340bc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:07:39 compute-0 nova_compute[117514]: 2025-10-08 19:07:39.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:40 compute-0 nova_compute[117514]: 2025-10-08 19:07:40.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:43 compute-0 ovn_controller[19759]: 2025-10-08T19:07:43Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d7:63:9d 10.100.0.6
Oct 08 19:07:43 compute-0 ovn_controller[19759]: 2025-10-08T19:07:43Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d7:63:9d 10.100.0.6
Oct 08 19:07:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:44.228 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:07:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:44.229 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:07:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:44.230 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:07:44 compute-0 nova_compute[117514]: 2025-10-08 19:07:44.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:45 compute-0 nova_compute[117514]: 2025-10-08 19:07:45.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:46 compute-0 podman[145899]: 2025-10-08 19:07:46.682038475 +0000 UTC m=+0.094762517 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:07:49 compute-0 nova_compute[117514]: 2025-10-08 19:07:49.117 2 INFO nova.compute.manager [None req-71e8e984-2524-4caf-8dc5-182df0d9ea11 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Get console output
Oct 08 19:07:49 compute-0 nova_compute[117514]: 2025-10-08 19:07:49.126 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct 08 19:07:49 compute-0 nova_compute[117514]: 2025-10-08 19:07:49.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:50 compute-0 nova_compute[117514]: 2025-10-08 19:07:50.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:52 compute-0 podman[145923]: 2025-10-08 19:07:52.676441624 +0000 UTC m=+0.092589922 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 08 19:07:52 compute-0 podman[145924]: 2025-10-08 19:07:52.680727653 +0000 UTC m=+0.091333184 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:07:53 compute-0 nova_compute[117514]: 2025-10-08 19:07:53.292 2 DEBUG oslo_concurrency.lockutils [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "interface-b66b330b-1cad-4dfb-a2f9-83201dc8ee32-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:07:53 compute-0 nova_compute[117514]: 2025-10-08 19:07:53.292 2 DEBUG oslo_concurrency.lockutils [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "interface-b66b330b-1cad-4dfb-a2f9-83201dc8ee32-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:07:53 compute-0 nova_compute[117514]: 2025-10-08 19:07:53.292 2 DEBUG nova.objects.instance [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'flavor' on Instance uuid b66b330b-1cad-4dfb-a2f9-83201dc8ee32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:07:54 compute-0 nova_compute[117514]: 2025-10-08 19:07:54.196 2 DEBUG nova.objects.instance [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_requests' on Instance uuid b66b330b-1cad-4dfb-a2f9-83201dc8ee32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:07:54 compute-0 nova_compute[117514]: 2025-10-08 19:07:54.213 2 DEBUG nova.network.neutron [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 08 19:07:54 compute-0 nova_compute[117514]: 2025-10-08 19:07:54.350 2 DEBUG nova.policy [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 08 19:07:54 compute-0 nova_compute[117514]: 2025-10-08 19:07:54.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:55 compute-0 nova_compute[117514]: 2025-10-08 19:07:55.373 2 DEBUG nova.network.neutron [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Successfully created port: 6943627d-6614-41cb-9460-f0454c6defb1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 08 19:07:55 compute-0 nova_compute[117514]: 2025-10-08 19:07:55.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:56 compute-0 podman[145966]: 2025-10-08 19:07:56.621422006 +0000 UTC m=+0.049655182 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 08 19:07:57 compute-0 nova_compute[117514]: 2025-10-08 19:07:57.098 2 DEBUG nova.network.neutron [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Successfully updated port: 6943627d-6614-41cb-9460-f0454c6defb1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 08 19:07:57 compute-0 nova_compute[117514]: 2025-10-08 19:07:57.115 2 DEBUG oslo_concurrency.lockutils [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:07:57 compute-0 nova_compute[117514]: 2025-10-08 19:07:57.115 2 DEBUG oslo_concurrency.lockutils [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:07:57 compute-0 nova_compute[117514]: 2025-10-08 19:07:57.115 2 DEBUG nova.network.neutron [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 08 19:07:57 compute-0 nova_compute[117514]: 2025-10-08 19:07:57.224 2 DEBUG nova.compute.manager [req-1731de79-2251-41c4-8ced-d39a834649c9 req-79a98552-bcdc-46fd-b25f-317a3475f875 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-changed-6943627d-6614-41cb-9460-f0454c6defb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:07:57 compute-0 nova_compute[117514]: 2025-10-08 19:07:57.225 2 DEBUG nova.compute.manager [req-1731de79-2251-41c4-8ced-d39a834649c9 req-79a98552-bcdc-46fd-b25f-317a3475f875 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Refreshing instance network info cache due to event network-changed-6943627d-6614-41cb-9460-f0454c6defb1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:07:57 compute-0 nova_compute[117514]: 2025-10-08 19:07:57.225 2 DEBUG oslo_concurrency.lockutils [req-1731de79-2251-41c4-8ced-d39a834649c9 req-79a98552-bcdc-46fd-b25f-317a3475f875 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:07:58 compute-0 podman[145991]: 2025-10-08 19:07:58.672930666 +0000 UTC m=+0.076791338 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 08 19:07:58 compute-0 podman[145990]: 2025-10-08 19:07:58.698307798 +0000 UTC m=+0.108583652 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3)
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.530 2 DEBUG nova.network.neutron [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updating instance_info_cache with network_info: [{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.550 2 DEBUG oslo_concurrency.lockutils [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.552 2 DEBUG oslo_concurrency.lockutils [req-1731de79-2251-41c4-8ced-d39a834649c9 req-79a98552-bcdc-46fd-b25f-317a3475f875 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.553 2 DEBUG nova.network.neutron [req-1731de79-2251-41c4-8ced-d39a834649c9 req-79a98552-bcdc-46fd-b25f-317a3475f875 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Refreshing network info cache for port 6943627d-6614-41cb-9460-f0454c6defb1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.557 2 DEBUG nova.virt.libvirt.vif [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:07:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-602516393',display_name='tempest-TestNetworkBasicOps-server-602516393',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-602516393',id=3,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1aTvBPkgf1VfNjvC4uuKKg+tISnXImijmvs2cGp+FtgeRrvxYh9lBxLRU9xSzH0Z6LaCabBaf6NwgK+eU8uEwumcvsX4qsd2EcbV6VjIknh+8LBbcMTdQeQSSFJx6qhQ==',key_name='tempest-TestNetworkBasicOps-1029193278',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:07:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-dwebwbaf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:07:32Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b66b330b-1cad-4dfb-a2f9-83201dc8ee32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.558 2 DEBUG nova.network.os_vif_util [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.559 2 DEBUG nova.network.os_vif_util [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.560 2 DEBUG os_vif [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.563 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.563 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.574 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6943627d-66, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.575 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6943627d-66, col_values=(('external_ids', {'iface-id': '6943627d-6614-41cb-9460-f0454c6defb1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bc:a5:e4', 'vm-uuid': 'b66b330b-1cad-4dfb-a2f9-83201dc8ee32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:59 compute-0 NetworkManager[1035]: <info>  [1759950479.5777] manager: (tap6943627d-66): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.587 2 INFO os_vif [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66')
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.588 2 DEBUG nova.virt.libvirt.vif [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:07:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-602516393',display_name='tempest-TestNetworkBasicOps-server-602516393',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-602516393',id=3,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1aTvBPkgf1VfNjvC4uuKKg+tISnXImijmvs2cGp+FtgeRrvxYh9lBxLRU9xSzH0Z6LaCabBaf6NwgK+eU8uEwumcvsX4qsd2EcbV6VjIknh+8LBbcMTdQeQSSFJx6qhQ==',key_name='tempest-TestNetworkBasicOps-1029193278',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:07:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-dwebwbaf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:07:32Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b66b330b-1cad-4dfb-a2f9-83201dc8ee32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.589 2 DEBUG nova.network.os_vif_util [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.589 2 DEBUG nova.network.os_vif_util [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.592 2 DEBUG nova.virt.libvirt.guest [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] attach device xml: <interface type="ethernet">
Oct 08 19:07:59 compute-0 nova_compute[117514]:   <mac address="fa:16:3e:bc:a5:e4"/>
Oct 08 19:07:59 compute-0 nova_compute[117514]:   <model type="virtio"/>
Oct 08 19:07:59 compute-0 nova_compute[117514]:   <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:07:59 compute-0 nova_compute[117514]:   <mtu size="1442"/>
Oct 08 19:07:59 compute-0 nova_compute[117514]:   <target dev="tap6943627d-66"/>
Oct 08 19:07:59 compute-0 nova_compute[117514]: </interface>
Oct 08 19:07:59 compute-0 nova_compute[117514]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Oct 08 19:07:59 compute-0 kernel: tap6943627d-66: entered promiscuous mode
Oct 08 19:07:59 compute-0 NetworkManager[1035]: <info>  [1759950479.6070] manager: (tap6943627d-66): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Oct 08 19:07:59 compute-0 ovn_controller[19759]: 2025-10-08T19:07:59Z|00053|binding|INFO|Claiming lport 6943627d-6614-41cb-9460-f0454c6defb1 for this chassis.
Oct 08 19:07:59 compute-0 ovn_controller[19759]: 2025-10-08T19:07:59Z|00054|binding|INFO|6943627d-6614-41cb-9460-f0454c6defb1: Claiming fa:16:3e:bc:a5:e4 10.100.0.29
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.618 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:a5:e4 10.100.0.29'], port_security=['fa:16:3e:bc:a5:e4 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': 'b66b330b-1cad-4dfb-a2f9-83201dc8ee32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73d9547-8a91-4802-82a8-1a3a035fe63c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'be57f10c-6afc-483d-a1fa-fab953b8fe3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4df9aed3-d2c0-400e-9a01-f8aebdd77f61, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=6943627d-6614-41cb-9460-f0454c6defb1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.620 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 6943627d-6614-41cb-9460-f0454c6defb1 in datapath c73d9547-8a91-4802-82a8-1a3a035fe63c bound to our chassis
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.621 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c73d9547-8a91-4802-82a8-1a3a035fe63c
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.638 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[c9d9ab4f-f9f8-46bf-9306-85f6bee831d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.640 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc73d9547-81 in ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.644 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc73d9547-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.645 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[42273a2d-a8ca-4b5b-b10f-8d6e91a9529c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.646 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[343c0c4c-1f14-4bc7-9fc7-ee22316609eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:59 compute-0 systemd-udevd[146037]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.668 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[1675ad6f-cd4d-4ecb-91f9-19f21f32a2f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:59 compute-0 ovn_controller[19759]: 2025-10-08T19:07:59Z|00055|binding|INFO|Setting lport 6943627d-6614-41cb-9460-f0454c6defb1 ovn-installed in OVS
Oct 08 19:07:59 compute-0 NetworkManager[1035]: <info>  [1759950479.6833] device (tap6943627d-66): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 08 19:07:59 compute-0 ovn_controller[19759]: 2025-10-08T19:07:59Z|00056|binding|INFO|Setting lport 6943627d-6614-41cb-9460-f0454c6defb1 up in Southbound
Oct 08 19:07:59 compute-0 NetworkManager[1035]: <info>  [1759950479.6841] device (tap6943627d-66): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.701 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ce478e36-ae47-4de9-92e7-24356054d7c9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.736 2 DEBUG nova.virt.libvirt.driver [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.736 2 DEBUG nova.virt.libvirt.driver [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.737 2 DEBUG nova.virt.libvirt.driver [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:d7:63:9d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.737 2 DEBUG nova.virt.libvirt.driver [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:bc:a5:e4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.740 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[aa9afb3c-8f76-4acc-9f67-d7f5e7db329f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:59 compute-0 systemd-udevd[146040]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.746 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[9db9c83b-45f9-4f7f-9fae-eedf56774ed9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:59 compute-0 NetworkManager[1035]: <info>  [1759950479.7477] manager: (tapc73d9547-80): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.762 2 DEBUG nova.virt.libvirt.guest [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:07:59 compute-0 nova_compute[117514]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:07:59 compute-0 nova_compute[117514]:   <nova:name>tempest-TestNetworkBasicOps-server-602516393</nova:name>
Oct 08 19:07:59 compute-0 nova_compute[117514]:   <nova:creationTime>2025-10-08 19:07:59</nova:creationTime>
Oct 08 19:07:59 compute-0 nova_compute[117514]:   <nova:flavor name="m1.nano">
Oct 08 19:07:59 compute-0 nova_compute[117514]:     <nova:memory>128</nova:memory>
Oct 08 19:07:59 compute-0 nova_compute[117514]:     <nova:disk>1</nova:disk>
Oct 08 19:07:59 compute-0 nova_compute[117514]:     <nova:swap>0</nova:swap>
Oct 08 19:07:59 compute-0 nova_compute[117514]:     <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:07:59 compute-0 nova_compute[117514]:     <nova:vcpus>1</nova:vcpus>
Oct 08 19:07:59 compute-0 nova_compute[117514]:   </nova:flavor>
Oct 08 19:07:59 compute-0 nova_compute[117514]:   <nova:owner>
Oct 08 19:07:59 compute-0 nova_compute[117514]:     <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:07:59 compute-0 nova_compute[117514]:     <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:07:59 compute-0 nova_compute[117514]:   </nova:owner>
Oct 08 19:07:59 compute-0 nova_compute[117514]:   <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:07:59 compute-0 nova_compute[117514]:   <nova:ports>
Oct 08 19:07:59 compute-0 nova_compute[117514]:     <nova:port uuid="0107be0e-1b4b-47dd-9422-a435ded0964c">
Oct 08 19:07:59 compute-0 nova_compute[117514]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct 08 19:07:59 compute-0 nova_compute[117514]:     </nova:port>
Oct 08 19:07:59 compute-0 nova_compute[117514]:     <nova:port uuid="6943627d-6614-41cb-9460-f0454c6defb1">
Oct 08 19:07:59 compute-0 nova_compute[117514]:       <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Oct 08 19:07:59 compute-0 nova_compute[117514]:     </nova:port>
Oct 08 19:07:59 compute-0 nova_compute[117514]:   </nova:ports>
Oct 08 19:07:59 compute-0 nova_compute[117514]: </nova:instance>
Oct 08 19:07:59 compute-0 nova_compute[117514]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Oct 08 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.790 2 DEBUG oslo_concurrency.lockutils [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "interface-b66b330b-1cad-4dfb-a2f9-83201dc8ee32-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.498s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.799 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[0b566470-a0d5-4810-b7d6-b301ee2109a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.802 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[e17852c6-1877-4c60-ab48-dfea238bfe31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:59 compute-0 NetworkManager[1035]: <info>  [1759950479.8395] device (tapc73d9547-80): carrier: link connected
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.850 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[e19ebe3e-7cdb-44c8-a41c-f373770b0b4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.875 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb3443e-cdb4-4fe1-997d-1b09ddc09b71]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc73d9547-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3f:68:c4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 115041, 'reachable_time': 35171, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 146081, 'error': None, 'target': 'ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.897 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1dce7aff-cd74-422e-8acd-407a1e60ec92]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3f:68c4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 115041, 'tstamp': 115041}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 146086, 'error': None, 'target': 'ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:59 compute-0 podman[146053]: 2025-10-08 19:07:59.909839332 +0000 UTC m=+0.117814149 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.923 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[693e2f04-f15a-415f-94be-615b5d8a0eac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc73d9547-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3f:68:c4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 115041, 'reachable_time': 35171, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 146090, 'error': None, 'target': 'ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.956 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[69905e99-2329-4dbb-bfdf-ecf3aab571c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.036 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[dd31f513-c07e-461f-a0e9-faf00a637cbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.038 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc73d9547-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.039 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.039 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc73d9547-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:00 compute-0 kernel: tapc73d9547-80: entered promiscuous mode
Oct 08 19:08:00 compute-0 NetworkManager[1035]: <info>  [1759950480.0428] manager: (tapc73d9547-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.046 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc73d9547-80, col_values=(('external_ids', {'iface-id': 'c436eb15-2527-4c5e-bb8f-6f582c6a8cdd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:00 compute-0 ovn_controller[19759]: 2025-10-08T19:08:00Z|00057|binding|INFO|Releasing lport c436eb15-2527-4c5e-bb8f-6f582c6a8cdd from this chassis (sb_readonly=0)
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.049 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c73d9547-8a91-4802-82a8-1a3a035fe63c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c73d9547-8a91-4802-82a8-1a3a035fe63c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 08 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.050 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[6288806b-d32c-4714-8c06-87d1b3199b2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.051 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]: global
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     log         /dev/log local0 debug
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     log-tag     haproxy-metadata-proxy-c73d9547-8a91-4802-82a8-1a3a035fe63c
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     user        root
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     group       root
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     maxconn     1024
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     pidfile     /var/lib/neutron/external/pids/c73d9547-8a91-4802-82a8-1a3a035fe63c.pid.haproxy
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     daemon
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]: defaults
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     log global
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     mode http
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     option httplog
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     option dontlognull
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     option http-server-close
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     option forwardfor
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     retries                 3
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     timeout http-request    30s
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     timeout connect         30s
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     timeout client          32s
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     timeout server          32s
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     timeout http-keep-alive 30s
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]: listen listener
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     bind 169.254.169.254:80
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     server metadata /var/lib/neutron/metadata_proxy
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:     http-request add-header X-OVN-Network-ID c73d9547-8a91-4802-82a8-1a3a035fe63c
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.051 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c', 'env', 'PROCESS_TAG=haproxy-c73d9547-8a91-4802-82a8-1a3a035fe63c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c73d9547-8a91-4802-82a8-1a3a035fe63c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 08 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.315 2 DEBUG nova.compute.manager [req-1bc7b36b-1905-4bcd-8f67-cfb96eb3762d req-e6765698-c1d3-48c7-8bfb-c9baa6b8da8e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-plugged-6943627d-6614-41cb-9460-f0454c6defb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.316 2 DEBUG oslo_concurrency.lockutils [req-1bc7b36b-1905-4bcd-8f67-cfb96eb3762d req-e6765698-c1d3-48c7-8bfb-c9baa6b8da8e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.317 2 DEBUG oslo_concurrency.lockutils [req-1bc7b36b-1905-4bcd-8f67-cfb96eb3762d req-e6765698-c1d3-48c7-8bfb-c9baa6b8da8e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.317 2 DEBUG oslo_concurrency.lockutils [req-1bc7b36b-1905-4bcd-8f67-cfb96eb3762d req-e6765698-c1d3-48c7-8bfb-c9baa6b8da8e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.317 2 DEBUG nova.compute.manager [req-1bc7b36b-1905-4bcd-8f67-cfb96eb3762d req-e6765698-c1d3-48c7-8bfb-c9baa6b8da8e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] No waiting events found dispatching network-vif-plugged-6943627d-6614-41cb-9460-f0454c6defb1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.318 2 WARNING nova.compute.manager [req-1bc7b36b-1905-4bcd-8f67-cfb96eb3762d req-e6765698-c1d3-48c7-8bfb-c9baa6b8da8e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received unexpected event network-vif-plugged-6943627d-6614-41cb-9460-f0454c6defb1 for instance with vm_state active and task_state None.
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.420 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:00 compute-0 podman[146122]: 2025-10-08 19:08:00.489070587 +0000 UTC m=+0.098926122 container create e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:08:00 compute-0 podman[146122]: 2025-10-08 19:08:00.431406385 +0000 UTC m=+0.041261960 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 08 19:08:00 compute-0 systemd[1]: Started libpod-conmon-e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637.scope.
Oct 08 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:00 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:08:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/274f69a4d16ffcd141ca0c2aabf5d247e94ba26d928301563c5c3be5cc17c132/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 08 19:08:00 compute-0 podman[146122]: 2025-10-08 19:08:00.605915106 +0000 UTC m=+0.215770671 container init e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:08:00 compute-0 podman[146122]: 2025-10-08 19:08:00.634655079 +0000 UTC m=+0.244510614 container start e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct 08 19:08:00 compute-0 neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c[146137]: [NOTICE]   (146141) : New worker (146143) forked
Oct 08 19:08:00 compute-0 neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c[146137]: [NOTICE]   (146141) : Loading success.
Oct 08 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.710 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.319 2 DEBUG nova.network.neutron [req-1731de79-2251-41c4-8ced-d39a834649c9 req-79a98552-bcdc-46fd-b25f-317a3475f875 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updated VIF entry in instance network info cache for port 6943627d-6614-41cb-9460-f0454c6defb1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.320 2 DEBUG nova.network.neutron [req-1731de79-2251-41c4-8ced-d39a834649c9 req-79a98552-bcdc-46fd-b25f-317a3475f875 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updating instance_info_cache with network_info: [{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.337 2 DEBUG oslo_concurrency.lockutils [req-1731de79-2251-41c4-8ced-d39a834649c9 req-79a98552-bcdc-46fd-b25f-317a3475f875 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:08:01 compute-0 ovn_controller[19759]: 2025-10-08T19:08:01Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bc:a5:e4 10.100.0.29
Oct 08 19:08:01 compute-0 ovn_controller[19759]: 2025-10-08T19:08:01Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bc:a5:e4 10.100.0.29
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.452 2 DEBUG oslo_concurrency.lockutils [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "interface-b66b330b-1cad-4dfb-a2f9-83201dc8ee32-6943627d-6614-41cb-9460-f0454c6defb1" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.453 2 DEBUG oslo_concurrency.lockutils [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "interface-b66b330b-1cad-4dfb-a2f9-83201dc8ee32-6943627d-6614-41cb-9460-f0454c6defb1" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.469 2 DEBUG nova.objects.instance [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'flavor' on Instance uuid b66b330b-1cad-4dfb-a2f9-83201dc8ee32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.491 2 DEBUG nova.virt.libvirt.vif [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:07:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-602516393',display_name='tempest-TestNetworkBasicOps-server-602516393',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-602516393',id=3,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1aTvBPkgf1VfNjvC4uuKKg+tISnXImijmvs2cGp+FtgeRrvxYh9lBxLRU9xSzH0Z6LaCabBaf6NwgK+eU8uEwumcvsX4qsd2EcbV6VjIknh+8LBbcMTdQeQSSFJx6qhQ==',key_name='tempest-TestNetworkBasicOps-1029193278',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:07:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-dwebwbaf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:07:32Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b66b330b-1cad-4dfb-a2f9-83201dc8ee32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.492 2 DEBUG nova.network.os_vif_util [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.493 2 DEBUG nova.network.os_vif_util [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.498 2 DEBUG nova.virt.libvirt.guest [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.501 2 DEBUG nova.virt.libvirt.guest [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.505 2 DEBUG nova.virt.libvirt.driver [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Attempting to detach device tap6943627d-66 from instance b66b330b-1cad-4dfb-a2f9-83201dc8ee32 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.506 2 DEBUG nova.virt.libvirt.guest [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] detach device xml: <interface type="ethernet">
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <mac address="fa:16:3e:bc:a5:e4"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <model type="virtio"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <mtu size="1442"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <target dev="tap6943627d-66"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]: </interface>
Oct 08 19:08:01 compute-0 nova_compute[117514]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.513 2 DEBUG nova.virt.libvirt.guest [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.518 2 DEBUG nova.virt.libvirt.guest [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface>not found in domain: <domain type='kvm' id='3'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <name>instance-00000003</name>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <uuid>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</uuid>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:name>tempest-TestNetworkBasicOps-server-602516393</nova:name>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:creationTime>2025-10-08 19:07:59</nova:creationTime>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:flavor name="m1.nano">
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:memory>128</nova:memory>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:disk>1</nova:disk>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:swap>0</nova:swap>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:vcpus>1</nova:vcpus>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </nova:flavor>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:owner>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </nova:owner>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:ports>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:port uuid="0107be0e-1b4b-47dd-9422-a435ded0964c">
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </nova:port>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:port uuid="6943627d-6614-41cb-9460-f0454c6defb1">
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </nova:port>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </nova:ports>
Oct 08 19:08:01 compute-0 nova_compute[117514]: </nova:instance>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <memory unit='KiB'>131072</memory>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <currentMemory unit='KiB'>131072</currentMemory>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <vcpu placement='static'>1</vcpu>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <resource>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <partition>/machine</partition>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </resource>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <sysinfo type='smbios'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <system>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <entry name='manufacturer'>RDO</entry>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <entry name='product'>OpenStack Compute</entry>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <entry name='serial'>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <entry name='uuid'>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <entry name='family'>Virtual Machine</entry>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </system>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <os>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <boot dev='hd'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <smbios mode='sysinfo'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </os>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <features>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <vmcoreinfo state='on'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </features>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <cpu mode='custom' match='exact' check='full'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <model fallback='forbid'>EPYC-Rome</model>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <vendor>AMD</vendor>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='x2apic'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='tsc-deadline'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='hypervisor'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='tsc_adjust'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='spec-ctrl'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='stibp'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='arch-capabilities'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='ssbd'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='cmp_legacy'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='overflow-recov'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='succor'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='ibrs'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='amd-ssbd'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='virt-ssbd'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='lbrv'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='tsc-scale'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='vmcb-clean'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='flushbyasid'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='pause-filter'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='pfthreshold'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='svme-addr-chk'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='lfence-always-serializing'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='rdctl-no'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='mds-no'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='pschange-mc-no'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='gds-no'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='rfds-no'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='xsaves'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='svm'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='topoext'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='npt'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='nrip-save'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <clock offset='utc'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <timer name='pit' tickpolicy='delay'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <timer name='rtc' tickpolicy='catchup'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <timer name='hpet' present='no'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <on_poweroff>destroy</on_poweroff>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <on_reboot>restart</on_reboot>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <on_crash>destroy</on_crash>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <disk type='file' device='disk'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <driver name='qemu' type='qcow2' cache='none'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <source file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk' index='2'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <backingStore type='file' index='3'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:         <format type='raw'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:         <source file='/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:         <backingStore/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       </backingStore>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target dev='vda' bus='virtio'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='virtio-disk0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <disk type='file' device='cdrom'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <driver name='qemu' type='raw' cache='none'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <source file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.config' index='1'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <backingStore/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target dev='sda' bus='sata'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <readonly/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='sata0-0-0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='0' model='pcie-root'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pcie.0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='1' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='1' port='0x10'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.1'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='2' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='2' port='0x11'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.2'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='3' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='3' port='0x12'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.3'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='4' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='4' port='0x13'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.4'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='5' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='5' port='0x14'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.5'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='6' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='6' port='0x15'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.6'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='7' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='7' port='0x16'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.7'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='8' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='8' port='0x17'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.8'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='9' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='9' port='0x18'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.9'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='10' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='10' port='0x19'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.10'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='11' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='11' port='0x1a'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.11'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='12' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='12' port='0x1b'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.12'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='13' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='13' port='0x1c'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.13'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='14' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='14' port='0x1d'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.14'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='15' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='15' port='0x1e'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.15'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='16' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='16' port='0x1f'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.16'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='17' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='17' port='0x20'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.17'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='18' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='18' port='0x21'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.18'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='19' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='19' port='0x22'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.19'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='20' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='20' port='0x23'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.20'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='21' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='21' port='0x24'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.21'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='22' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='22' port='0x25'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.22'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='23' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='23' port='0x26'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.23'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='24' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='24' port='0x27'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.24'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='25' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='25' port='0x28'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.25'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-pci-bridge'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.26'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='usb' index='0' model='piix3-uhci'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='usb'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='sata' index='0'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='ide'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <interface type='ethernet'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <mac address='fa:16:3e:d7:63:9d'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target dev='tap0107be0e-1b'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model type='virtio'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <driver name='vhost' rx_queue_size='512'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <mtu size='1442'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='net0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <interface type='ethernet'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <mac address='fa:16:3e:bc:a5:e4'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target dev='tap6943627d-66'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model type='virtio'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <driver name='vhost' rx_queue_size='512'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <mtu size='1442'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='net1'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <serial type='pty'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <source path='/dev/pts/0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <log file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log' append='off'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target type='isa-serial' port='0'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:         <model name='isa-serial'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       </target>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='serial0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <console type='pty' tty='/dev/pts/0'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <source path='/dev/pts/0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <log file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log' append='off'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target type='serial' port='0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='serial0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </console>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <input type='tablet' bus='usb'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='input0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='usb' bus='0' port='1'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </input>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <input type='mouse' bus='ps2'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='input1'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </input>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <input type='keyboard' bus='ps2'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='input2'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </input>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <listen type='address' address='::0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </graphics>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <audio id='1' type='none'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <video>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model type='virtio' heads='1' primary='yes'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='video0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </video>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <watchdog model='itco' action='reset'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='watchdog0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </watchdog>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <memballoon model='virtio'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <stats period='10'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='balloon0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <rng model='virtio'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <backend model='random'>/dev/urandom</backend>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='rng0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <label>system_u:system_r:svirt_t:s0:c277,c815</label>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c277,c815</imagelabel>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </seclabel>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <label>+107:+107</label>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <imagelabel>+107:+107</imagelabel>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </seclabel>
Oct 08 19:08:01 compute-0 nova_compute[117514]: </domain>
Oct 08 19:08:01 compute-0 nova_compute[117514]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.519 2 INFO nova.virt.libvirt.driver [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully detached device tap6943627d-66 from instance b66b330b-1cad-4dfb-a2f9-83201dc8ee32 from the persistent domain config.
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.520 2 DEBUG nova.virt.libvirt.driver [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] (1/8): Attempting to detach device tap6943627d-66 with device alias net1 from instance b66b330b-1cad-4dfb-a2f9-83201dc8ee32 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.520 2 DEBUG nova.virt.libvirt.guest [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] detach device xml: <interface type="ethernet">
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <mac address="fa:16:3e:bc:a5:e4"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <model type="virtio"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <mtu size="1442"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <target dev="tap6943627d-66"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]: </interface>
Oct 08 19:08:01 compute-0 nova_compute[117514]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Oct 08 19:08:01 compute-0 kernel: tap6943627d-66 (unregistering): left promiscuous mode
Oct 08 19:08:01 compute-0 NetworkManager[1035]: <info>  [1759950481.5881] device (tap6943627d-66): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.596 2 DEBUG nova.virt.libvirt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Received event <DeviceRemovedEvent: 1759950481.5965085, b66b330b-1cad-4dfb-a2f9-83201dc8ee32 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Oct 08 19:08:01 compute-0 ovn_controller[19759]: 2025-10-08T19:08:01Z|00058|binding|INFO|Releasing lport 6943627d-6614-41cb-9460-f0454c6defb1 from this chassis (sb_readonly=0)
Oct 08 19:08:01 compute-0 ovn_controller[19759]: 2025-10-08T19:08:01Z|00059|binding|INFO|Setting lport 6943627d-6614-41cb-9460-f0454c6defb1 down in Southbound
Oct 08 19:08:01 compute-0 ovn_controller[19759]: 2025-10-08T19:08:01Z|00060|binding|INFO|Removing iface tap6943627d-66 ovn-installed in OVS
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.599 2 DEBUG nova.virt.libvirt.driver [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Start waiting for the detach event from libvirt for device tap6943627d-66 with device alias net1 for instance b66b330b-1cad-4dfb-a2f9-83201dc8ee32 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.599 2 DEBUG nova.virt.libvirt.guest [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.609 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:a5:e4 10.100.0.29'], port_security=['fa:16:3e:bc:a5:e4 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': 'b66b330b-1cad-4dfb-a2f9-83201dc8ee32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73d9547-8a91-4802-82a8-1a3a035fe63c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'be57f10c-6afc-483d-a1fa-fab953b8fe3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4df9aed3-d2c0-400e-9a01-f8aebdd77f61, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=6943627d-6614-41cb-9460-f0454c6defb1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.611 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 6943627d-6614-41cb-9460-f0454c6defb1 in datapath c73d9547-8a91-4802-82a8-1a3a035fe63c unbound from our chassis
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.606 2 DEBUG nova.virt.libvirt.guest [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface>not found in domain: <domain type='kvm' id='3'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <name>instance-00000003</name>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <uuid>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</uuid>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:name>tempest-TestNetworkBasicOps-server-602516393</nova:name>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:creationTime>2025-10-08 19:07:59</nova:creationTime>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:flavor name="m1.nano">
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:memory>128</nova:memory>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:disk>1</nova:disk>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:swap>0</nova:swap>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:vcpus>1</nova:vcpus>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </nova:flavor>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:owner>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </nova:owner>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:ports>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:port uuid="0107be0e-1b4b-47dd-9422-a435ded0964c">
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </nova:port>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:port uuid="6943627d-6614-41cb-9460-f0454c6defb1">
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </nova:port>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </nova:ports>
Oct 08 19:08:01 compute-0 nova_compute[117514]: </nova:instance>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <memory unit='KiB'>131072</memory>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <currentMemory unit='KiB'>131072</currentMemory>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <vcpu placement='static'>1</vcpu>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <resource>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <partition>/machine</partition>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </resource>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <sysinfo type='smbios'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <system>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <entry name='manufacturer'>RDO</entry>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <entry name='product'>OpenStack Compute</entry>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <entry name='serial'>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <entry name='uuid'>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <entry name='family'>Virtual Machine</entry>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </system>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <os>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <boot dev='hd'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <smbios mode='sysinfo'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </os>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <features>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <vmcoreinfo state='on'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </features>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <cpu mode='custom' match='exact' check='full'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <model fallback='forbid'>EPYC-Rome</model>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <vendor>AMD</vendor>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='x2apic'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='tsc-deadline'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='hypervisor'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='tsc_adjust'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='spec-ctrl'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='stibp'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='arch-capabilities'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='ssbd'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='cmp_legacy'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='overflow-recov'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='succor'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='ibrs'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='amd-ssbd'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='virt-ssbd'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='lbrv'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='tsc-scale'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='vmcb-clean'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='flushbyasid'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='pause-filter'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='pfthreshold'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='svme-addr-chk'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='lfence-always-serializing'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='rdctl-no'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='mds-no'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='pschange-mc-no'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='gds-no'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='rfds-no'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='xsaves'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='svm'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='require' name='topoext'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='npt'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <feature policy='disable' name='nrip-save'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <clock offset='utc'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <timer name='pit' tickpolicy='delay'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <timer name='rtc' tickpolicy='catchup'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <timer name='hpet' present='no'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <on_poweroff>destroy</on_poweroff>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <on_reboot>restart</on_reboot>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <on_crash>destroy</on_crash>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <disk type='file' device='disk'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <driver name='qemu' type='qcow2' cache='none'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <source file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk' index='2'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <backingStore type='file' index='3'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:         <format type='raw'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:         <source file='/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:         <backingStore/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       </backingStore>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target dev='vda' bus='virtio'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='virtio-disk0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <disk type='file' device='cdrom'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <driver name='qemu' type='raw' cache='none'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <source file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.config' index='1'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <backingStore/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target dev='sda' bus='sata'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <readonly/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='sata0-0-0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='0' model='pcie-root'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pcie.0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='1' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='1' port='0x10'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.1'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='2' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='2' port='0x11'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.2'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='3' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='3' port='0x12'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.3'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='4' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='4' port='0x13'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.4'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='5' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='5' port='0x14'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.5'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='6' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='6' port='0x15'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.6'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='7' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='7' port='0x16'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.7'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='8' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='8' port='0x17'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.8'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='9' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='9' port='0x18'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.9'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='10' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='10' port='0x19'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.10'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='11' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='11' port='0x1a'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.11'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='12' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='12' port='0x1b'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.12'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='13' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='13' port='0x1c'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.13'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='14' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='14' port='0x1d'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.14'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='15' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='15' port='0x1e'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.15'/>
Oct 08 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.613 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73d9547-8a91-4802-82a8-1a3a035fe63c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='16' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='16' port='0x1f'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.16'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='17' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='17' port='0x20'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.17'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='18' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='18' port='0x21'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.18'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='19' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='19' port='0x22'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.19'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='20' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='20' port='0x23'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.20'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='21' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='21' port='0x24'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.21'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='22' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='22' port='0x25'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.22'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='23' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='23' port='0x26'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.23'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='24' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='24' port='0x27'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.24'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='25' model='pcie-root-port'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target chassis='25' port='0x28'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.25'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model name='pcie-pci-bridge'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='pci.26'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='usb' index='0' model='piix3-uhci'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='usb'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <controller type='sata' index='0'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='ide'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <interface type='ethernet'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <mac address='fa:16:3e:d7:63:9d'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target dev='tap0107be0e-1b'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model type='virtio'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <driver name='vhost' rx_queue_size='512'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <mtu size='1442'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='net0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <serial type='pty'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <source path='/dev/pts/0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <log file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log' append='off'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target type='isa-serial' port='0'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:         <model name='isa-serial'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       </target>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='serial0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <console type='pty' tty='/dev/pts/0'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <source path='/dev/pts/0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <log file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log' append='off'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <target type='serial' port='0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='serial0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </console>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <input type='tablet' bus='usb'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='input0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='usb' bus='0' port='1'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </input>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <input type='mouse' bus='ps2'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='input1'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </input>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <input type='keyboard' bus='ps2'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='input2'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </input>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <listen type='address' address='::0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </graphics>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <audio id='1' type='none'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <video>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <model type='virtio' heads='1' primary='yes'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='video0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </video>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <watchdog model='itco' action='reset'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='watchdog0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </watchdog>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <memballoon model='virtio'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <stats period='10'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='balloon0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <rng model='virtio'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <backend model='random'>/dev/urandom</backend>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <alias name='rng0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <label>system_u:system_r:svirt_t:s0:c277,c815</label>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c277,c815</imagelabel>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </seclabel>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <label>+107:+107</label>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <imagelabel>+107:+107</imagelabel>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </seclabel>
Oct 08 19:08:01 compute-0 nova_compute[117514]: </domain>
Oct 08 19:08:01 compute-0 nova_compute[117514]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.607 2 INFO nova.virt.libvirt.driver [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully detached device tap6943627d-66 from instance b66b330b-1cad-4dfb-a2f9-83201dc8ee32 from the live domain config.
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.608 2 DEBUG nova.virt.libvirt.vif [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:07:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-602516393',display_name='tempest-TestNetworkBasicOps-server-602516393',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-602516393',id=3,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1aTvBPkgf1VfNjvC4uuKKg+tISnXImijmvs2cGp+FtgeRrvxYh9lBxLRU9xSzH0Z6LaCabBaf6NwgK+eU8uEwumcvsX4qsd2EcbV6VjIknh+8LBbcMTdQeQSSFJx6qhQ==',key_name='tempest-TestNetworkBasicOps-1029193278',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:07:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-dwebwbaf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:07:32Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b66b330b-1cad-4dfb-a2f9-83201dc8ee32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.608 2 DEBUG nova.network.os_vif_util [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.608 2 DEBUG nova.network.os_vif_util [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.609 2 DEBUG os_vif [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.610 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6943627d-66, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.614 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[68fc4b23-3161-44cc-8087-3db7e67f3382]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.615 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c namespace which is not needed anymore
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.628 2 INFO os_vif [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66')
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.629 2 DEBUG nova.virt.libvirt.guest [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:name>tempest-TestNetworkBasicOps-server-602516393</nova:name>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:creationTime>2025-10-08 19:08:01</nova:creationTime>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:flavor name="m1.nano">
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:memory>128</nova:memory>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:disk>1</nova:disk>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:swap>0</nova:swap>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:vcpus>1</nova:vcpus>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </nova:flavor>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:owner>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </nova:owner>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   <nova:ports>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     <nova:port uuid="0107be0e-1b4b-47dd-9422-a435ded0964c">
Oct 08 19:08:01 compute-0 nova_compute[117514]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct 08 19:08:01 compute-0 nova_compute[117514]:     </nova:port>
Oct 08 19:08:01 compute-0 nova_compute[117514]:   </nova:ports>
Oct 08 19:08:01 compute-0 nova_compute[117514]: </nova:instance>
Oct 08 19:08:01 compute-0 nova_compute[117514]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Oct 08 19:08:01 compute-0 neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c[146137]: [NOTICE]   (146141) : haproxy version is 2.8.14-c23fe91
Oct 08 19:08:01 compute-0 neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c[146137]: [NOTICE]   (146141) : path to executable is /usr/sbin/haproxy
Oct 08 19:08:01 compute-0 neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c[146137]: [WARNING]  (146141) : Exiting Master process...
Oct 08 19:08:01 compute-0 neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c[146137]: [WARNING]  (146141) : Exiting Master process...
Oct 08 19:08:01 compute-0 neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c[146137]: [ALERT]    (146141) : Current worker (146143) exited with code 143 (Terminated)
Oct 08 19:08:01 compute-0 neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c[146137]: [WARNING]  (146141) : All workers exited. Exiting... (0)
Oct 08 19:08:01 compute-0 systemd[1]: libpod-e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637.scope: Deactivated successfully.
Oct 08 19:08:01 compute-0 podman[146173]: 2025-10-08 19:08:01.777604952 +0000 UTC m=+0.042460126 container died e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct 08 19:08:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637-userdata-shm.mount: Deactivated successfully.
Oct 08 19:08:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-274f69a4d16ffcd141ca0c2aabf5d247e94ba26d928301563c5c3be5cc17c132-merged.mount: Deactivated successfully.
Oct 08 19:08:01 compute-0 podman[146173]: 2025-10-08 19:08:01.814851161 +0000 UTC m=+0.079706345 container cleanup e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct 08 19:08:01 compute-0 systemd[1]: libpod-conmon-e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637.scope: Deactivated successfully.
Oct 08 19:08:01 compute-0 podman[146211]: 2025-10-08 19:08:01.872341417 +0000 UTC m=+0.036351702 container remove e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.879 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[73371fd9-02e3-4551-a7fb-50b56767626c]: (4, ('Wed Oct  8 07:08:01 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c (e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637)\ne8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637\nWed Oct  8 07:08:01 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c (e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637)\ne8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.881 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[09382b5c-2f24-4a4d-8609-c77914f14cea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.883 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc73d9547-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:08:01 compute-0 kernel: tapc73d9547-80: left promiscuous mode
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.911 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[aedf8744-deff-404f-b1ad-a21929c862c0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.938 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[7bef5564-83be-4906-89eb-4a45adde8950]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.940 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1ae5991c-fefe-4d12-bdd4-a2c06f98f019]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.954 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b4f94b68-5b78-422f-8cd4-6a94bba88f3d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 115031, 'reachable_time': 34555, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 146226, 'error': None, 'target': 'ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.956 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 08 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.956 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[bf558c68-a6d0-45cc-a64b-b5bcc502b6c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:01 compute-0 systemd[1]: run-netns-ovnmeta\x2dc73d9547\x2d8a91\x2d4802\x2d82a8\x2d1a3a035fe63c.mount: Deactivated successfully.
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.309 2 DEBUG oslo_concurrency.lockutils [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.310 2 DEBUG oslo_concurrency.lockutils [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.310 2 DEBUG nova.network.neutron [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.351 2 DEBUG nova.compute.manager [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-deleted-6943627d-6614-41cb-9460-f0454c6defb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.351 2 INFO nova.compute.manager [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Neutron deleted interface 6943627d-6614-41cb-9460-f0454c6defb1; detaching it from the instance and deleting it from the info cache
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.352 2 DEBUG nova.network.neutron [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updating instance_info_cache with network_info: [{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.374 2 DEBUG nova.objects.instance [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lazy-loading 'system_metadata' on Instance uuid b66b330b-1cad-4dfb-a2f9-83201dc8ee32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.404 2 DEBUG nova.compute.manager [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-plugged-6943627d-6614-41cb-9460-f0454c6defb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.405 2 DEBUG oslo_concurrency.lockutils [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.406 2 DEBUG oslo_concurrency.lockutils [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.406 2 DEBUG oslo_concurrency.lockutils [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.406 2 DEBUG nova.compute.manager [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] No waiting events found dispatching network-vif-plugged-6943627d-6614-41cb-9460-f0454c6defb1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.406 2 WARNING nova.compute.manager [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received unexpected event network-vif-plugged-6943627d-6614-41cb-9460-f0454c6defb1 for instance with vm_state active and task_state None.
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.407 2 DEBUG nova.compute.manager [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-unplugged-6943627d-6614-41cb-9460-f0454c6defb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.407 2 DEBUG oslo_concurrency.lockutils [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.407 2 DEBUG oslo_concurrency.lockutils [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.408 2 DEBUG oslo_concurrency.lockutils [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.408 2 DEBUG nova.compute.manager [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] No waiting events found dispatching network-vif-unplugged-6943627d-6614-41cb-9460-f0454c6defb1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.408 2 WARNING nova.compute.manager [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received unexpected event network-vif-unplugged-6943627d-6614-41cb-9460-f0454c6defb1 for instance with vm_state active and task_state None.
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.408 2 DEBUG nova.compute.manager [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-plugged-6943627d-6614-41cb-9460-f0454c6defb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.409 2 DEBUG oslo_concurrency.lockutils [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.409 2 DEBUG oslo_concurrency.lockutils [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.409 2 DEBUG oslo_concurrency.lockutils [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.409 2 DEBUG nova.compute.manager [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] No waiting events found dispatching network-vif-plugged-6943627d-6614-41cb-9460-f0454c6defb1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.410 2 WARNING nova.compute.manager [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received unexpected event network-vif-plugged-6943627d-6614-41cb-9460-f0454c6defb1 for instance with vm_state active and task_state None.
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.414 2 DEBUG nova.objects.instance [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lazy-loading 'flavor' on Instance uuid b66b330b-1cad-4dfb-a2f9-83201dc8ee32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.455 2 DEBUG nova.virt.libvirt.vif [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:07:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-602516393',display_name='tempest-TestNetworkBasicOps-server-602516393',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-602516393',id=3,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1aTvBPkgf1VfNjvC4uuKKg+tISnXImijmvs2cGp+FtgeRrvxYh9lBxLRU9xSzH0Z6LaCabBaf6NwgK+eU8uEwumcvsX4qsd2EcbV6VjIknh+8LBbcMTdQeQSSFJx6qhQ==',key_name='tempest-TestNetworkBasicOps-1029193278',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:07:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-dwebwbaf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:07:32Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b66b330b-1cad-4dfb-a2f9-83201dc8ee32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.455 2 DEBUG nova.network.os_vif_util [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Converting VIF {"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.456 2 DEBUG nova.network.os_vif_util [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.460 2 DEBUG nova.virt.libvirt.guest [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.467 2 DEBUG nova.virt.libvirt.guest [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface>not found in domain: <domain type='kvm' id='3'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <name>instance-00000003</name>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <uuid>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</uuid>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:name>tempest-TestNetworkBasicOps-server-602516393</nova:name>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:creationTime>2025-10-08 19:08:01</nova:creationTime>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:flavor name="m1.nano">
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:memory>128</nova:memory>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:disk>1</nova:disk>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:swap>0</nova:swap>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:vcpus>1</nova:vcpus>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </nova:flavor>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:owner>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </nova:owner>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:ports>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:port uuid="0107be0e-1b4b-47dd-9422-a435ded0964c">
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </nova:port>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </nova:ports>
Oct 08 19:08:02 compute-0 nova_compute[117514]: </nova:instance>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <memory unit='KiB'>131072</memory>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <currentMemory unit='KiB'>131072</currentMemory>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <vcpu placement='static'>1</vcpu>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <resource>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <partition>/machine</partition>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </resource>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <sysinfo type='smbios'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <system>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <entry name='manufacturer'>RDO</entry>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <entry name='product'>OpenStack Compute</entry>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <entry name='serial'>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <entry name='uuid'>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <entry name='family'>Virtual Machine</entry>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </system>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <os>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <boot dev='hd'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <smbios mode='sysinfo'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </os>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <features>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <vmcoreinfo state='on'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </features>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <cpu mode='custom' match='exact' check='full'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <model fallback='forbid'>EPYC-Rome</model>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <vendor>AMD</vendor>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='x2apic'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='tsc-deadline'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='hypervisor'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='tsc_adjust'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='spec-ctrl'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='stibp'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='arch-capabilities'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='ssbd'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='cmp_legacy'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='overflow-recov'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='succor'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='ibrs'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='amd-ssbd'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='virt-ssbd'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='lbrv'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='tsc-scale'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='vmcb-clean'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='flushbyasid'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='pause-filter'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='pfthreshold'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='svme-addr-chk'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='lfence-always-serializing'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='rdctl-no'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='mds-no'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='pschange-mc-no'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='gds-no'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='rfds-no'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='xsaves'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='svm'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='topoext'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='npt'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='nrip-save'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <clock offset='utc'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <timer name='pit' tickpolicy='delay'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <timer name='rtc' tickpolicy='catchup'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <timer name='hpet' present='no'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <on_poweroff>destroy</on_poweroff>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <on_reboot>restart</on_reboot>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <on_crash>destroy</on_crash>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <disk type='file' device='disk'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <driver name='qemu' type='qcow2' cache='none'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <source file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk' index='2'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <backingStore type='file' index='3'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:         <format type='raw'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:         <source file='/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:         <backingStore/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       </backingStore>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target dev='vda' bus='virtio'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='virtio-disk0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <disk type='file' device='cdrom'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <driver name='qemu' type='raw' cache='none'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <source file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.config' index='1'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <backingStore/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target dev='sda' bus='sata'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <readonly/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='sata0-0-0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='0' model='pcie-root'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pcie.0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='1' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='1' port='0x10'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.1'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='2' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='2' port='0x11'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.2'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='3' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='3' port='0x12'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.3'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='4' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='4' port='0x13'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.4'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='5' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='5' port='0x14'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.5'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='6' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='6' port='0x15'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.6'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='7' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='7' port='0x16'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.7'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='8' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='8' port='0x17'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.8'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='9' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='9' port='0x18'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.9'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='10' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='10' port='0x19'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.10'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='11' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='11' port='0x1a'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.11'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='12' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='12' port='0x1b'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.12'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='13' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='13' port='0x1c'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.13'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='14' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='14' port='0x1d'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.14'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='15' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='15' port='0x1e'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.15'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='16' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='16' port='0x1f'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.16'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='17' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='17' port='0x20'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.17'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='18' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='18' port='0x21'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.18'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='19' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='19' port='0x22'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.19'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='20' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='20' port='0x23'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.20'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='21' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='21' port='0x24'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.21'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='22' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='22' port='0x25'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.22'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='23' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='23' port='0x26'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.23'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='24' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='24' port='0x27'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.24'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='25' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='25' port='0x28'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.25'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-pci-bridge'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.26'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='usb' index='0' model='piix3-uhci'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='usb'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='sata' index='0'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='ide'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <interface type='ethernet'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <mac address='fa:16:3e:d7:63:9d'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target dev='tap0107be0e-1b'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model type='virtio'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <driver name='vhost' rx_queue_size='512'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <mtu size='1442'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='net0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <serial type='pty'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <source path='/dev/pts/0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <log file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log' append='off'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target type='isa-serial' port='0'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:         <model name='isa-serial'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       </target>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='serial0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <console type='pty' tty='/dev/pts/0'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <source path='/dev/pts/0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <log file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log' append='off'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target type='serial' port='0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='serial0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </console>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <input type='tablet' bus='usb'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='input0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='usb' bus='0' port='1'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </input>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <input type='mouse' bus='ps2'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='input1'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </input>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <input type='keyboard' bus='ps2'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='input2'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </input>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <listen type='address' address='::0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </graphics>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <audio id='1' type='none'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <video>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model type='virtio' heads='1' primary='yes'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='video0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </video>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <watchdog model='itco' action='reset'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='watchdog0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </watchdog>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <memballoon model='virtio'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <stats period='10'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='balloon0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <rng model='virtio'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <backend model='random'>/dev/urandom</backend>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='rng0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <label>system_u:system_r:svirt_t:s0:c277,c815</label>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c277,c815</imagelabel>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </seclabel>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <label>+107:+107</label>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <imagelabel>+107:+107</imagelabel>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </seclabel>
Oct 08 19:08:02 compute-0 nova_compute[117514]: </domain>
Oct 08 19:08:02 compute-0 nova_compute[117514]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.467 2 DEBUG nova.virt.libvirt.guest [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.471 2 DEBUG nova.virt.libvirt.guest [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface>not found in domain: <domain type='kvm' id='3'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <name>instance-00000003</name>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <uuid>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</uuid>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:name>tempest-TestNetworkBasicOps-server-602516393</nova:name>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:creationTime>2025-10-08 19:08:01</nova:creationTime>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:flavor name="m1.nano">
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:memory>128</nova:memory>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:disk>1</nova:disk>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:swap>0</nova:swap>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:vcpus>1</nova:vcpus>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </nova:flavor>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:owner>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </nova:owner>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:ports>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:port uuid="0107be0e-1b4b-47dd-9422-a435ded0964c">
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </nova:port>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </nova:ports>
Oct 08 19:08:02 compute-0 nova_compute[117514]: </nova:instance>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <memory unit='KiB'>131072</memory>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <currentMemory unit='KiB'>131072</currentMemory>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <vcpu placement='static'>1</vcpu>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <resource>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <partition>/machine</partition>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </resource>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <sysinfo type='smbios'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <system>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <entry name='manufacturer'>RDO</entry>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <entry name='product'>OpenStack Compute</entry>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <entry name='serial'>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <entry name='uuid'>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <entry name='family'>Virtual Machine</entry>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </system>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <os>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <boot dev='hd'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <smbios mode='sysinfo'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </os>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <features>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <vmcoreinfo state='on'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </features>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <cpu mode='custom' match='exact' check='full'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <model fallback='forbid'>EPYC-Rome</model>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <vendor>AMD</vendor>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='x2apic'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='tsc-deadline'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='hypervisor'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='tsc_adjust'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='spec-ctrl'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='stibp'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='arch-capabilities'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='ssbd'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='cmp_legacy'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='overflow-recov'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='succor'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='ibrs'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='amd-ssbd'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='virt-ssbd'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='lbrv'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='tsc-scale'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='vmcb-clean'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='flushbyasid'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='pause-filter'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='pfthreshold'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='svme-addr-chk'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='lfence-always-serializing'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='rdctl-no'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='mds-no'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='pschange-mc-no'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='gds-no'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='rfds-no'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='xsaves'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='svm'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='require' name='topoext'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='npt'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <feature policy='disable' name='nrip-save'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <clock offset='utc'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <timer name='pit' tickpolicy='delay'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <timer name='rtc' tickpolicy='catchup'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <timer name='hpet' present='no'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <on_poweroff>destroy</on_poweroff>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <on_reboot>restart</on_reboot>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <on_crash>destroy</on_crash>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <disk type='file' device='disk'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <driver name='qemu' type='qcow2' cache='none'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <source file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk' index='2'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <backingStore type='file' index='3'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:         <format type='raw'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:         <source file='/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:         <backingStore/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       </backingStore>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target dev='vda' bus='virtio'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='virtio-disk0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <disk type='file' device='cdrom'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <driver name='qemu' type='raw' cache='none'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <source file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.config' index='1'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <backingStore/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target dev='sda' bus='sata'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <readonly/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='sata0-0-0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='0' model='pcie-root'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pcie.0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='1' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='1' port='0x10'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.1'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='2' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='2' port='0x11'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.2'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='3' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='3' port='0x12'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.3'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='4' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='4' port='0x13'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.4'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='5' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='5' port='0x14'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.5'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='6' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='6' port='0x15'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.6'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='7' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='7' port='0x16'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.7'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='8' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='8' port='0x17'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.8'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='9' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='9' port='0x18'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.9'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='10' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='10' port='0x19'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.10'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='11' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='11' port='0x1a'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.11'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='12' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='12' port='0x1b'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.12'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='13' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='13' port='0x1c'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.13'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='14' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='14' port='0x1d'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.14'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='15' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='15' port='0x1e'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.15'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='16' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='16' port='0x1f'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.16'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='17' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='17' port='0x20'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.17'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='18' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='18' port='0x21'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.18'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='19' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='19' port='0x22'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.19'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='20' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='20' port='0x23'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.20'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='21' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='21' port='0x24'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.21'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='22' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='22' port='0x25'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.22'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='23' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='23' port='0x26'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.23'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='24' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='24' port='0x27'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.24'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='25' model='pcie-root-port'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target chassis='25' port='0x28'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.25'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model name='pcie-pci-bridge'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='pci.26'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='usb' index='0' model='piix3-uhci'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='usb'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <controller type='sata' index='0'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='ide'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <interface type='ethernet'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <mac address='fa:16:3e:d7:63:9d'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target dev='tap0107be0e-1b'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model type='virtio'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <driver name='vhost' rx_queue_size='512'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <mtu size='1442'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='net0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <serial type='pty'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <source path='/dev/pts/0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <log file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log' append='off'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target type='isa-serial' port='0'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:         <model name='isa-serial'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       </target>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='serial0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <console type='pty' tty='/dev/pts/0'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <source path='/dev/pts/0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <log file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log' append='off'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <target type='serial' port='0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='serial0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </console>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <input type='tablet' bus='usb'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='input0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='usb' bus='0' port='1'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </input>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <input type='mouse' bus='ps2'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='input1'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </input>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <input type='keyboard' bus='ps2'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='input2'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </input>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <listen type='address' address='::0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </graphics>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <audio id='1' type='none'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <video>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <model type='virtio' heads='1' primary='yes'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='video0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </video>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <watchdog model='itco' action='reset'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='watchdog0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </watchdog>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <memballoon model='virtio'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <stats period='10'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='balloon0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <rng model='virtio'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <backend model='random'>/dev/urandom</backend>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <alias name='rng0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <label>system_u:system_r:svirt_t:s0:c277,c815</label>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c277,c815</imagelabel>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </seclabel>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <label>+107:+107</label>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <imagelabel>+107:+107</imagelabel>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </seclabel>
Oct 08 19:08:02 compute-0 nova_compute[117514]: </domain>
Oct 08 19:08:02 compute-0 nova_compute[117514]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.472 2 WARNING nova.virt.libvirt.driver [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Detaching interface fa:16:3e:bc:a5:e4 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap6943627d-66' not found.
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.472 2 DEBUG nova.virt.libvirt.vif [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:07:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-602516393',display_name='tempest-TestNetworkBasicOps-server-602516393',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-602516393',id=3,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1aTvBPkgf1VfNjvC4uuKKg+tISnXImijmvs2cGp+FtgeRrvxYh9lBxLRU9xSzH0Z6LaCabBaf6NwgK+eU8uEwumcvsX4qsd2EcbV6VjIknh+8LBbcMTdQeQSSFJx6qhQ==',key_name='tempest-TestNetworkBasicOps-1029193278',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:07:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-dwebwbaf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:07:32Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b66b330b-1cad-4dfb-a2f9-83201dc8ee32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.473 2 DEBUG nova.network.os_vif_util [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Converting VIF {"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.473 2 DEBUG nova.network.os_vif_util [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.474 2 DEBUG os_vif [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.475 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6943627d-66, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.476 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.479 2 INFO os_vif [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66')
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.480 2 DEBUG nova.virt.libvirt.guest [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:name>tempest-TestNetworkBasicOps-server-602516393</nova:name>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:creationTime>2025-10-08 19:08:02</nova:creationTime>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:flavor name="m1.nano">
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:memory>128</nova:memory>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:disk>1</nova:disk>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:swap>0</nova:swap>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:vcpus>1</nova:vcpus>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </nova:flavor>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:owner>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </nova:owner>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   <nova:ports>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     <nova:port uuid="0107be0e-1b4b-47dd-9422-a435ded0964c">
Oct 08 19:08:02 compute-0 nova_compute[117514]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct 08 19:08:02 compute-0 nova_compute[117514]:     </nova:port>
Oct 08 19:08:02 compute-0 nova_compute[117514]:   </nova:ports>
Oct 08 19:08:02 compute-0 nova_compute[117514]: </nova:instance>
Oct 08 19:08:02 compute-0 nova_compute[117514]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Oct 08 19:08:02 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:02.712 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:08:03 compute-0 ovn_controller[19759]: 2025-10-08T19:08:03Z|00061|binding|INFO|Releasing lport b2172a75-691e-43ff-a242-3b06a5bfd197 from this chassis (sb_readonly=0)
Oct 08 19:08:03 compute-0 nova_compute[117514]: 2025-10-08 19:08:03.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:03 compute-0 nova_compute[117514]: 2025-10-08 19:08:03.609 2 INFO nova.network.neutron [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Port 6943627d-6614-41cb-9460-f0454c6defb1 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Oct 08 19:08:03 compute-0 nova_compute[117514]: 2025-10-08 19:08:03.610 2 DEBUG nova.network.neutron [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updating instance_info_cache with network_info: [{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:08:03 compute-0 nova_compute[117514]: 2025-10-08 19:08:03.626 2 DEBUG oslo_concurrency.lockutils [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:08:03 compute-0 nova_compute[117514]: 2025-10-08 19:08:03.659 2 DEBUG oslo_concurrency.lockutils [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "interface-b66b330b-1cad-4dfb-a2f9-83201dc8ee32-6943627d-6614-41cb-9460-f0454c6defb1" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:03 compute-0 nova_compute[117514]: 2025-10-08 19:08:03.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.302 2 DEBUG oslo_concurrency.lockutils [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.303 2 DEBUG oslo_concurrency.lockutils [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.303 2 DEBUG oslo_concurrency.lockutils [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.304 2 DEBUG oslo_concurrency.lockutils [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.305 2 DEBUG oslo_concurrency.lockutils [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.307 2 INFO nova.compute.manager [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Terminating instance
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.309 2 DEBUG nova.compute.manager [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 08 19:08:04 compute-0 kernel: tap0107be0e-1b (unregistering): left promiscuous mode
Oct 08 19:08:04 compute-0 NetworkManager[1035]: <info>  [1759950484.3369] device (tap0107be0e-1b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 19:08:04 compute-0 ovn_controller[19759]: 2025-10-08T19:08:04Z|00062|binding|INFO|Releasing lport 0107be0e-1b4b-47dd-9422-a435ded0964c from this chassis (sb_readonly=0)
Oct 08 19:08:04 compute-0 ovn_controller[19759]: 2025-10-08T19:08:04Z|00063|binding|INFO|Setting lport 0107be0e-1b4b-47dd-9422-a435ded0964c down in Southbound
Oct 08 19:08:04 compute-0 ovn_controller[19759]: 2025-10-08T19:08:04Z|00064|binding|INFO|Removing iface tap0107be0e-1b ovn-installed in OVS
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.353 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:63:9d 10.100.0.6'], port_security=['fa:16:3e:d7:63:9d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'b66b330b-1cad-4dfb-a2f9-83201dc8ee32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-15690acb-54cf-4081-a718-c14a1c0af6a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '18c7314c-d74a-4643-933f-4dc6b05c33cc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9980b68-53e4-4dfd-a3d6-cbcaebcf011d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=0107be0e-1b4b-47dd-9422-a435ded0964c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.354 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 0107be0e-1b4b-47dd-9422-a435ded0964c in datapath 15690acb-54cf-4081-a718-c14a1c0af6a8 unbound from our chassis
Oct 08 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.355 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 15690acb-54cf-4081-a718-c14a1c0af6a8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 08 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.356 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[4ccca8e8-6874-4fc2-b8a8-7480457d14be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.357 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8 namespace which is not needed anymore
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:04 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Deactivated successfully.
Oct 08 19:08:04 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Consumed 14.532s CPU time.
Oct 08 19:08:04 compute-0 systemd-machined[77568]: Machine qemu-3-instance-00000003 terminated.
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.446 2 DEBUG nova.compute.manager [req-0d9e9642-a7dc-4aaa-87b5-79dd9358d42f req-495efbe4-c5c9-4f3e-80d5-5189ce5e2366 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-changed-0107be0e-1b4b-47dd-9422-a435ded0964c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.447 2 DEBUG nova.compute.manager [req-0d9e9642-a7dc-4aaa-87b5-79dd9358d42f req-495efbe4-c5c9-4f3e-80d5-5189ce5e2366 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Refreshing instance network info cache due to event network-changed-0107be0e-1b4b-47dd-9422-a435ded0964c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.448 2 DEBUG oslo_concurrency.lockutils [req-0d9e9642-a7dc-4aaa-87b5-79dd9358d42f req-495efbe4-c5c9-4f3e-80d5-5189ce5e2366 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.448 2 DEBUG oslo_concurrency.lockutils [req-0d9e9642-a7dc-4aaa-87b5-79dd9358d42f req-495efbe4-c5c9-4f3e-80d5-5189ce5e2366 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.449 2 DEBUG nova.network.neutron [req-0d9e9642-a7dc-4aaa-87b5-79dd9358d42f req-495efbe4-c5c9-4f3e-80d5-5189ce5e2366 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Refreshing network info cache for port 0107be0e-1b4b-47dd-9422-a435ded0964c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:08:04 compute-0 neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8[145840]: [NOTICE]   (145844) : haproxy version is 2.8.14-c23fe91
Oct 08 19:08:04 compute-0 neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8[145840]: [NOTICE]   (145844) : path to executable is /usr/sbin/haproxy
Oct 08 19:08:04 compute-0 neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8[145840]: [WARNING]  (145844) : Exiting Master process...
Oct 08 19:08:04 compute-0 neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8[145840]: [ALERT]    (145844) : Current worker (145846) exited with code 143 (Terminated)
Oct 08 19:08:04 compute-0 neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8[145840]: [WARNING]  (145844) : All workers exited. Exiting... (0)
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:04 compute-0 systemd[1]: libpod-f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c.scope: Deactivated successfully.
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:04 compute-0 podman[146251]: 2025-10-08 19:08:04.541412234 +0000 UTC m=+0.064732265 container died f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 08 19:08:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c-userdata-shm.mount: Deactivated successfully.
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.576 2 INFO nova.virt.libvirt.driver [-] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Instance destroyed successfully.
Oct 08 19:08:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-3dc6caed12c224c239b697cd06381493a291f904c3b4b3172b2f62f362bdce12-merged.mount: Deactivated successfully.
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.577 2 DEBUG nova.objects.instance [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid b66b330b-1cad-4dfb-a2f9-83201dc8ee32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.595 2 DEBUG nova.virt.libvirt.vif [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:07:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-602516393',display_name='tempest-TestNetworkBasicOps-server-602516393',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-602516393',id=3,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1aTvBPkgf1VfNjvC4uuKKg+tISnXImijmvs2cGp+FtgeRrvxYh9lBxLRU9xSzH0Z6LaCabBaf6NwgK+eU8uEwumcvsX4qsd2EcbV6VjIknh+8LBbcMTdQeQSSFJx6qhQ==',key_name='tempest-TestNetworkBasicOps-1029193278',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:07:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-dwebwbaf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:07:32Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b66b330b-1cad-4dfb-a2f9-83201dc8ee32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.596 2 DEBUG nova.network.os_vif_util [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.596 2 DEBUG nova.network.os_vif_util [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d7:63:9d,bridge_name='br-int',has_traffic_filtering=True,id=0107be0e-1b4b-47dd-9422-a435ded0964c,network=Network(15690acb-54cf-4081-a718-c14a1c0af6a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0107be0e-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.597 2 DEBUG os_vif [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:63:9d,bridge_name='br-int',has_traffic_filtering=True,id=0107be0e-1b4b-47dd-9422-a435ded0964c,network=Network(15690acb-54cf-4081-a718-c14a1c0af6a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0107be0e-1b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.599 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0107be0e-1b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.606 2 INFO os_vif [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:63:9d,bridge_name='br-int',has_traffic_filtering=True,id=0107be0e-1b4b-47dd-9422-a435ded0964c,network=Network(15690acb-54cf-4081-a718-c14a1c0af6a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0107be0e-1b')
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.607 2 INFO nova.virt.libvirt.driver [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Deleting instance files /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32_del
Oct 08 19:08:04 compute-0 podman[146251]: 2025-10-08 19:08:04.60755249 +0000 UTC m=+0.130872531 container cleanup f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.608 2 INFO nova.virt.libvirt.driver [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Deletion of /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32_del complete
Oct 08 19:08:04 compute-0 systemd[1]: libpod-conmon-f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c.scope: Deactivated successfully.
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.662 2 INFO nova.compute.manager [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Took 0.35 seconds to destroy the instance on the hypervisor.
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.662 2 DEBUG oslo.service.loopingcall [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.663 2 DEBUG nova.compute.manager [-] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.663 2 DEBUG nova.network.neutron [-] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 08 19:08:04 compute-0 podman[146298]: 2025-10-08 19:08:04.675035896 +0000 UTC m=+0.044570349 container remove f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 08 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.682 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[8595acef-2323-4c41-a767-ef6219114b33]: (4, ('Wed Oct  8 07:08:04 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8 (f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c)\nf1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c\nWed Oct  8 07:08:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8 (f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c)\nf1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.683 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1115fbbd-5af1-4285-ae5e-25479fbf8592]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.684 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15690acb-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:04 compute-0 kernel: tap15690acb-50: left promiscuous mode
Oct 08 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.690 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1abf3e45-c32a-499d-bd70-6dd0a1a409d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.712 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.715 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[eb067c56-ca49-45c3-8382-ea3392983c35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.716 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[238d5399-05f5-47b9-8d0a-b58a19fa3228]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.735 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.735 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.736 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.736 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.741 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[33dd0ff9-ae89-48d0-9230-fee19e264ae7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 112111, 'reachable_time': 33419, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 146313, 'error': None, 'target': 'ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.743 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 08 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.743 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[52f196b6-fd66-4e57-994c-f46511ad677c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:04 compute-0 systemd[1]: run-netns-ovnmeta\x2d15690acb\x2d54cf\x2d4081\x2da718\x2dc14a1c0af6a8.mount: Deactivated successfully.
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.763 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.763 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.763 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.763 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.947 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.948 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6100MB free_disk=73.4237289428711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.948 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.949 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.013 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Instance b66b330b-1cad-4dfb-a2f9-83201dc8ee32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.014 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.014 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.065 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.079 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.101 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.102 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.339 2 DEBUG nova.network.neutron [-] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.356 2 INFO nova.compute.manager [-] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Took 0.69 seconds to deallocate network for instance.
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.397 2 DEBUG oslo_concurrency.lockutils [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.397 2 DEBUG oslo_concurrency.lockutils [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.453 2 DEBUG nova.compute.provider_tree [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.468 2 DEBUG nova.scheduler.client.report [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.488 2 DEBUG oslo_concurrency.lockutils [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.512 2 INFO nova.scheduler.client.report [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance b66b330b-1cad-4dfb-a2f9-83201dc8ee32
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.577 2 DEBUG oslo_concurrency.lockutils [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.662 2 DEBUG nova.network.neutron [req-0d9e9642-a7dc-4aaa-87b5-79dd9358d42f req-495efbe4-c5c9-4f3e-80d5-5189ce5e2366 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updated VIF entry in instance network info cache for port 0107be0e-1b4b-47dd-9422-a435ded0964c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.663 2 DEBUG nova.network.neutron [req-0d9e9642-a7dc-4aaa-87b5-79dd9358d42f req-495efbe4-c5c9-4f3e-80d5-5189ce5e2366 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updating instance_info_cache with network_info: [{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.682 2 DEBUG oslo_concurrency.lockutils [req-0d9e9642-a7dc-4aaa-87b5-79dd9358d42f req-495efbe4-c5c9-4f3e-80d5-5189ce5e2366 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.082 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.083 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.526 2 DEBUG nova.compute.manager [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-unplugged-0107be0e-1b4b-47dd-9422-a435ded0964c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.527 2 DEBUG oslo_concurrency.lockutils [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.528 2 DEBUG oslo_concurrency.lockutils [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.528 2 DEBUG oslo_concurrency.lockutils [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.528 2 DEBUG nova.compute.manager [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] No waiting events found dispatching network-vif-unplugged-0107be0e-1b4b-47dd-9422-a435ded0964c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.529 2 WARNING nova.compute.manager [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received unexpected event network-vif-unplugged-0107be0e-1b4b-47dd-9422-a435ded0964c for instance with vm_state deleted and task_state None.
Oct 08 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.529 2 DEBUG nova.compute.manager [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-plugged-0107be0e-1b4b-47dd-9422-a435ded0964c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.530 2 DEBUG oslo_concurrency.lockutils [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.530 2 DEBUG oslo_concurrency.lockutils [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.530 2 DEBUG oslo_concurrency.lockutils [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.531 2 DEBUG nova.compute.manager [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] No waiting events found dispatching network-vif-plugged-0107be0e-1b4b-47dd-9422-a435ded0964c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.531 2 WARNING nova.compute.manager [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received unexpected event network-vif-plugged-0107be0e-1b4b-47dd-9422-a435ded0964c for instance with vm_state deleted and task_state None.
Oct 08 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.531 2 DEBUG nova.compute.manager [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-deleted-0107be0e-1b4b-47dd-9422-a435ded0964c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:08:07 compute-0 podman[146315]: 2025-10-08 19:08:07.645217755 +0000 UTC m=+0.061874670 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 08 19:08:07 compute-0 nova_compute[117514]: 2025-10-08 19:08:07.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:08:07 compute-0 nova_compute[117514]: 2025-10-08 19:08:07.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:08 compute-0 nova_compute[117514]: 2025-10-08 19:08:08.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.241 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:08:09 compute-0 nova_compute[117514]: 2025-10-08 19:08:09.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:10 compute-0 nova_compute[117514]: 2025-10-08 19:08:10.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:14 compute-0 nova_compute[117514]: 2025-10-08 19:08:14.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:15 compute-0 nova_compute[117514]: 2025-10-08 19:08:15.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:17 compute-0 podman[146340]: 2025-10-08 19:08:17.65097625 +0000 UTC m=+0.070733406 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 08 19:08:19 compute-0 nova_compute[117514]: 2025-10-08 19:08:19.573 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950484.571772, b66b330b-1cad-4dfb-a2f9-83201dc8ee32 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:08:19 compute-0 nova_compute[117514]: 2025-10-08 19:08:19.573 2 INFO nova.compute.manager [-] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] VM Stopped (Lifecycle Event)
Oct 08 19:08:19 compute-0 nova_compute[117514]: 2025-10-08 19:08:19.598 2 DEBUG nova.compute.manager [None req-a9d41b9d-dd59-4090-8238-125f25f63212 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:08:19 compute-0 nova_compute[117514]: 2025-10-08 19:08:19.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:20 compute-0 nova_compute[117514]: 2025-10-08 19:08:20.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.349 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "b81092db-79a9-4570-9579-4e100364515a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.349 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.364 2 DEBUG nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.437 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.438 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.450 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.451 2 INFO nova.compute.claims [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Claim successful on node compute-0.ctlplane.example.com
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.552 2 DEBUG nova.compute.provider_tree [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.565 2 DEBUG nova.scheduler.client.report [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.586 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.586 2 DEBUG nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.648 2 DEBUG nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.649 2 DEBUG nova.network.neutron [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.670 2 INFO nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.692 2 DEBUG nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.784 2 DEBUG nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.786 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.786 2 INFO nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Creating image(s)
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.787 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.787 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.788 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.802 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.825 2 DEBUG nova.policy [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.868 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.869 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.870 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.881 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.936 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.937 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.985 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.986 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.986 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.066 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.067 2 DEBUG nova.virt.disk.api [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 08 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.068 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.125 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.126 2 DEBUG nova.virt.disk.api [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 08 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.127 2 DEBUG nova.objects.instance [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid b81092db-79a9-4570-9579-4e100364515a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.142 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 08 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.143 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Ensure instance console log exists: /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 08 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.144 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.145 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.145 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:23 compute-0 podman[146377]: 2025-10-08 19:08:23.674956617 +0000 UTC m=+0.084916961 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 08 19:08:23 compute-0 podman[146376]: 2025-10-08 19:08:23.686438452 +0000 UTC m=+0.096659154 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_id=edpm)
Oct 08 19:08:24 compute-0 nova_compute[117514]: 2025-10-08 19:08:24.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:25 compute-0 nova_compute[117514]: 2025-10-08 19:08:25.566 2 DEBUG nova.network.neutron [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Successfully created port: 4df96566-2548-47bc-bd48-095ff9ce5a25 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 08 19:08:25 compute-0 nova_compute[117514]: 2025-10-08 19:08:25.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:26 compute-0 nova_compute[117514]: 2025-10-08 19:08:26.113 2 DEBUG nova.network.neutron [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Successfully updated port: 4df96566-2548-47bc-bd48-095ff9ce5a25 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 08 19:08:26 compute-0 nova_compute[117514]: 2025-10-08 19:08:26.132 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:08:26 compute-0 nova_compute[117514]: 2025-10-08 19:08:26.133 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:08:26 compute-0 nova_compute[117514]: 2025-10-08 19:08:26.134 2 DEBUG nova.network.neutron [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 08 19:08:26 compute-0 nova_compute[117514]: 2025-10-08 19:08:26.199 2 DEBUG nova.compute.manager [req-8d578775-f424-4733-a45c-99b7475d10fa req-3b39912f-7708-4f89-9b82-95c2372b1b7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received event network-changed-4df96566-2548-47bc-bd48-095ff9ce5a25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:08:26 compute-0 nova_compute[117514]: 2025-10-08 19:08:26.200 2 DEBUG nova.compute.manager [req-8d578775-f424-4733-a45c-99b7475d10fa req-3b39912f-7708-4f89-9b82-95c2372b1b7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Refreshing instance network info cache due to event network-changed-4df96566-2548-47bc-bd48-095ff9ce5a25. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:08:26 compute-0 nova_compute[117514]: 2025-10-08 19:08:26.200 2 DEBUG oslo_concurrency.lockutils [req-8d578775-f424-4733-a45c-99b7475d10fa req-3b39912f-7708-4f89-9b82-95c2372b1b7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:08:26 compute-0 nova_compute[117514]: 2025-10-08 19:08:26.255 2 DEBUG nova.network.neutron [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 08 19:08:27 compute-0 podman[146412]: 2025-10-08 19:08:27.667483669 +0000 UTC m=+0.072981863 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.084 2 DEBUG nova.network.neutron [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Updating instance_info_cache with network_info: [{"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.102 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.102 2 DEBUG nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Instance network_info: |[{"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.103 2 DEBUG oslo_concurrency.lockutils [req-8d578775-f424-4733-a45c-99b7475d10fa req-3b39912f-7708-4f89-9b82-95c2372b1b7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.103 2 DEBUG nova.network.neutron [req-8d578775-f424-4733-a45c-99b7475d10fa req-3b39912f-7708-4f89-9b82-95c2372b1b7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Refreshing network info cache for port 4df96566-2548-47bc-bd48-095ff9ce5a25 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.109 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Start _get_guest_xml network_info=[{"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.117 2 WARNING nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.127 2 DEBUG nova.virt.libvirt.host [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.128 2 DEBUG nova.virt.libvirt.host [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.133 2 DEBUG nova.virt.libvirt.host [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.134 2 DEBUG nova.virt.libvirt.host [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.134 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.135 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.135 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.135 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.136 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.136 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.136 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.136 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.136 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.137 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.137 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.137 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.140 2 DEBUG nova.virt.libvirt.vif [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:08:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1923358122',display_name='tempest-TestNetworkBasicOps-server-1923358122',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1923358122',id=4,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLQLVJLI0B1DuDHRr0xZejVz519BcFo77SQm/iU8QOSD6bvHcTPIzjucvYocQDiXeDjzdepuMi6T99yqrAkyTWA86BuQoBq3ywvQZ7i+b1z4o3zuHDlJxNAK8zAsugXiSA==',key_name='tempest-TestNetworkBasicOps-993932891',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-bunw0mg3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:08:22Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b81092db-79a9-4570-9579-4e100364515a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.141 2 DEBUG nova.network.os_vif_util [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.141 2 DEBUG nova.network.os_vif_util [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:31:02,bridge_name='br-int',has_traffic_filtering=True,id=4df96566-2548-47bc-bd48-095ff9ce5a25,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4df96566-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.142 2 DEBUG nova.objects.instance [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid b81092db-79a9-4570-9579-4e100364515a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.185 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] End _get_guest_xml xml=<domain type="kvm">
Oct 08 19:08:28 compute-0 nova_compute[117514]:   <uuid>b81092db-79a9-4570-9579-4e100364515a</uuid>
Oct 08 19:08:28 compute-0 nova_compute[117514]:   <name>instance-00000004</name>
Oct 08 19:08:28 compute-0 nova_compute[117514]:   <memory>131072</memory>
Oct 08 19:08:28 compute-0 nova_compute[117514]:   <vcpu>1</vcpu>
Oct 08 19:08:28 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <nova:name>tempest-TestNetworkBasicOps-server-1923358122</nova:name>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <nova:creationTime>2025-10-08 19:08:28</nova:creationTime>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <nova:flavor name="m1.nano">
Oct 08 19:08:28 compute-0 nova_compute[117514]:         <nova:memory>128</nova:memory>
Oct 08 19:08:28 compute-0 nova_compute[117514]:         <nova:disk>1</nova:disk>
Oct 08 19:08:28 compute-0 nova_compute[117514]:         <nova:swap>0</nova:swap>
Oct 08 19:08:28 compute-0 nova_compute[117514]:         <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:08:28 compute-0 nova_compute[117514]:         <nova:vcpus>1</nova:vcpus>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       </nova:flavor>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <nova:owner>
Oct 08 19:08:28 compute-0 nova_compute[117514]:         <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:08:28 compute-0 nova_compute[117514]:         <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       </nova:owner>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <nova:ports>
Oct 08 19:08:28 compute-0 nova_compute[117514]:         <nova:port uuid="4df96566-2548-47bc-bd48-095ff9ce5a25">
Oct 08 19:08:28 compute-0 nova_compute[117514]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:         </nova:port>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       </nova:ports>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     </nova:instance>
Oct 08 19:08:28 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:08:28 compute-0 nova_compute[117514]:   <sysinfo type="smbios">
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <system>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <entry name="manufacturer">RDO</entry>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <entry name="product">OpenStack Compute</entry>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <entry name="serial">b81092db-79a9-4570-9579-4e100364515a</entry>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <entry name="uuid">b81092db-79a9-4570-9579-4e100364515a</entry>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <entry name="family">Virtual Machine</entry>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     </system>
Oct 08 19:08:28 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:08:28 compute-0 nova_compute[117514]:   <os>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <boot dev="hd"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <smbios mode="sysinfo"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:   </os>
Oct 08 19:08:28 compute-0 nova_compute[117514]:   <features>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <vmcoreinfo/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:   </features>
Oct 08 19:08:28 compute-0 nova_compute[117514]:   <clock offset="utc">
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <timer name="pit" tickpolicy="delay"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <timer name="hpet" present="no"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:08:28 compute-0 nova_compute[117514]:   <cpu mode="host-model" match="exact">
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <topology sockets="1" cores="1" threads="1"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:08:28 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <disk type="file" device="disk">
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <target dev="vda" bus="virtio"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <disk type="file" device="cdrom">
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <driver name="qemu" type="raw" cache="none"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk.config"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <target dev="sda" bus="sata"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <interface type="ethernet">
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <mac address="fa:16:3e:f7:31:02"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <mtu size="1442"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <target dev="tap4df96566-25"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <serial type="pty">
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <log file="/var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/console.log" append="off"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <video>
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     </video>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <input type="tablet" bus="usb"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <rng model="virtio">
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <backend model="random">/dev/urandom</backend>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <controller type="usb" index="0"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     <memballoon model="virtio">
Oct 08 19:08:28 compute-0 nova_compute[117514]:       <stats period="10"/>
Oct 08 19:08:28 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:08:28 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:08:28 compute-0 nova_compute[117514]: </domain>
Oct 08 19:08:28 compute-0 nova_compute[117514]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.186 2 DEBUG nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Preparing to wait for external event network-vif-plugged-4df96566-2548-47bc-bd48-095ff9ce5a25 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.186 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "b81092db-79a9-4570-9579-4e100364515a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.186 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.187 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.187 2 DEBUG nova.virt.libvirt.vif [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:08:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1923358122',display_name='tempest-TestNetworkBasicOps-server-1923358122',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1923358122',id=4,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLQLVJLI0B1DuDHRr0xZejVz519BcFo77SQm/iU8QOSD6bvHcTPIzjucvYocQDiXeDjzdepuMi6T99yqrAkyTWA86BuQoBq3ywvQZ7i+b1z4o3zuHDlJxNAK8zAsugXiSA==',key_name='tempest-TestNetworkBasicOps-993932891',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-bunw0mg3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:08:22Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b81092db-79a9-4570-9579-4e100364515a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.188 2 DEBUG nova.network.os_vif_util [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.188 2 DEBUG nova.network.os_vif_util [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:31:02,bridge_name='br-int',has_traffic_filtering=True,id=4df96566-2548-47bc-bd48-095ff9ce5a25,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4df96566-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.189 2 DEBUG os_vif [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:31:02,bridge_name='br-int',has_traffic_filtering=True,id=4df96566-2548-47bc-bd48-095ff9ce5a25,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4df96566-25') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.189 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.190 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.192 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4df96566-25, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.193 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4df96566-25, col_values=(('external_ids', {'iface-id': '4df96566-2548-47bc-bd48-095ff9ce5a25', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f7:31:02', 'vm-uuid': 'b81092db-79a9-4570-9579-4e100364515a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:28 compute-0 NetworkManager[1035]: <info>  [1759950508.1954] manager: (tap4df96566-25): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.205 2 INFO os_vif [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:31:02,bridge_name='br-int',has_traffic_filtering=True,id=4df96566-2548-47bc-bd48-095ff9ce5a25,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4df96566-25')
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.311 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.311 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.312 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:f7:31:02, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 08 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.313 2 INFO nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Using config drive
Oct 08 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.274 2 INFO nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Creating config drive at /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk.config
Oct 08 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.284 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpce7lpz0x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.424 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpce7lpz0x" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:08:29 compute-0 kernel: tap4df96566-25: entered promiscuous mode
Oct 08 19:08:29 compute-0 NetworkManager[1035]: <info>  [1759950509.5342] manager: (tap4df96566-25): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Oct 08 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:29 compute-0 ovn_controller[19759]: 2025-10-08T19:08:29Z|00065|binding|INFO|Claiming lport 4df96566-2548-47bc-bd48-095ff9ce5a25 for this chassis.
Oct 08 19:08:29 compute-0 ovn_controller[19759]: 2025-10-08T19:08:29Z|00066|binding|INFO|4df96566-2548-47bc-bd48-095ff9ce5a25: Claiming fa:16:3e:f7:31:02 10.100.0.4
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.550 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:31:02 10.100.0.4'], port_security=['fa:16:3e:f7:31:02 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b81092db-79a9-4570-9579-4e100364515a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b3c14dd0-3cf2-41c1-9115-bc2ef0b741ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f7e04c-5c12-4776-b9f7-f4835ede26c3, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=4df96566-2548-47bc-bd48-095ff9ce5a25) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.552 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 4df96566-2548-47bc-bd48-095ff9ce5a25 in datapath 820a3a2e-47e5-4f6d-88d6-281476a31fb1 bound to our chassis
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.554 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 820a3a2e-47e5-4f6d-88d6-281476a31fb1
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.568 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[679f2536-32ff-4c7c-8956-b133be81e209]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.569 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap820a3a2e-41 in ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.571 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap820a3a2e-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.571 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[acfedef4-7d07-4662-ac0b-f5e0ff1d190c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.572 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[2821b07c-348a-42bd-90a6-051cf9d75300]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.585 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[31596fec-84c1-4bb5-9775-8f1ce7a72c29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:29 compute-0 systemd-udevd[146493]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:08:29 compute-0 systemd-machined[77568]: New machine qemu-4-instance-00000004.
Oct 08 19:08:29 compute-0 podman[146450]: 2025-10-08 19:08:29.61581587 +0000 UTC m=+0.094616772 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.616 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[638c5d6c-8051-4157-8585-eb895b47bb16]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:29 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000004.
Oct 08 19:08:29 compute-0 NetworkManager[1035]: <info>  [1759950509.6215] device (tap4df96566-25): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 08 19:08:29 compute-0 NetworkManager[1035]: <info>  [1759950509.6239] device (tap4df96566-25): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 08 19:08:29 compute-0 ovn_controller[19759]: 2025-10-08T19:08:29Z|00067|binding|INFO|Setting lport 4df96566-2548-47bc-bd48-095ff9ce5a25 ovn-installed in OVS
Oct 08 19:08:29 compute-0 ovn_controller[19759]: 2025-10-08T19:08:29Z|00068|binding|INFO|Setting lport 4df96566-2548-47bc-bd48-095ff9ce5a25 up in Southbound
Oct 08 19:08:29 compute-0 podman[146449]: 2025-10-08 19:08:29.62811415 +0000 UTC m=+0.102485919 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.654 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[1f0ba735-c1cc-47df-ab7b-6d6f40d4e39e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:29 compute-0 systemd-udevd[146499]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:08:29 compute-0 NetworkManager[1035]: <info>  [1759950509.6606] manager: (tap820a3a2e-40): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.660 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[d6276589-fc95-4a01-988e-e1facce27afe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.700 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[e416765f-36c8-4a1a-955a-caa8efa6290e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.705 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[dc8881ca-cc76-45e0-a057-f489cb993986]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:29 compute-0 NetworkManager[1035]: <info>  [1759950509.7350] device (tap820a3a2e-40): carrier: link connected
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.740 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[48b2f32c-767a-43ee-9cdc-2bb25f1b479e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.763 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[6d47d620-ca0e-4a15-9517-e252d3611eaf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap820a3a2e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:c1:bb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 118031, 'reachable_time': 41703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 146528, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.784 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[42bddc48-fc49-4b6a-8fb0-0efb0286b5b9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefc:c1bb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 118031, 'tstamp': 118031}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 146529, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.804 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[94ecda1a-b815-424d-ba24-f9788212cb6e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap820a3a2e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:c1:bb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 118031, 'reachable_time': 41703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 146530, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.845 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[9924e97f-5376-42e6-b147-cbbc50597c24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.916 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[772abeb5-c0fa-431b-b3b2-19b67a8b3272]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.918 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap820a3a2e-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.918 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.919 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap820a3a2e-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:29 compute-0 kernel: tap820a3a2e-40: entered promiscuous mode
Oct 08 19:08:29 compute-0 NetworkManager[1035]: <info>  [1759950509.9230] manager: (tap820a3a2e-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.925 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap820a3a2e-40, col_values=(('external_ids', {'iface-id': '9e4e54fa-32ec-4ece-b34d-e4e72c958a54'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.928 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/820a3a2e-47e5-4f6d-88d6-281476a31fb1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/820a3a2e-47e5-4f6d-88d6-281476a31fb1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.929 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[678df671-a2ea-4078-9210-19e1b73a7859]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.930 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: global
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     log         /dev/log local0 debug
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     log-tag     haproxy-metadata-proxy-820a3a2e-47e5-4f6d-88d6-281476a31fb1
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     user        root
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     group       root
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     maxconn     1024
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     pidfile     /var/lib/neutron/external/pids/820a3a2e-47e5-4f6d-88d6-281476a31fb1.pid.haproxy
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     daemon
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: defaults
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     log global
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     mode http
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     option httplog
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     option dontlognull
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     option http-server-close
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     option forwardfor
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     retries                 3
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     timeout http-request    30s
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     timeout connect         30s
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     timeout client          32s
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     timeout server          32s
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     timeout http-keep-alive 30s
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: listen listener
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     bind 169.254.169.254:80
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     server metadata /var/lib/neutron/metadata_proxy
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:     http-request add-header X-OVN-Network-ID 820a3a2e-47e5-4f6d-88d6-281476a31fb1
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 08 19:08:29 compute-0 ovn_controller[19759]: 2025-10-08T19:08:29Z|00069|binding|INFO|Releasing lport 9e4e54fa-32ec-4ece-b34d-e4e72c958a54 from this chassis (sb_readonly=0)
Oct 08 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.931 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'env', 'PROCESS_TAG=haproxy-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/820a3a2e-47e5-4f6d-88d6-281476a31fb1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 08 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.271 2 DEBUG nova.compute.manager [req-1601b8ce-7163-4bc3-8183-8ba97f510514 req-d4bdb2b1-1a74-45ce-9a06-8b154c98720a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received event network-vif-plugged-4df96566-2548-47bc-bd48-095ff9ce5a25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.272 2 DEBUG oslo_concurrency.lockutils [req-1601b8ce-7163-4bc3-8183-8ba97f510514 req-d4bdb2b1-1a74-45ce-9a06-8b154c98720a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b81092db-79a9-4570-9579-4e100364515a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.272 2 DEBUG oslo_concurrency.lockutils [req-1601b8ce-7163-4bc3-8183-8ba97f510514 req-d4bdb2b1-1a74-45ce-9a06-8b154c98720a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.273 2 DEBUG oslo_concurrency.lockutils [req-1601b8ce-7163-4bc3-8183-8ba97f510514 req-d4bdb2b1-1a74-45ce-9a06-8b154c98720a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.273 2 DEBUG nova.compute.manager [req-1601b8ce-7163-4bc3-8183-8ba97f510514 req-d4bdb2b1-1a74-45ce-9a06-8b154c98720a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Processing event network-vif-plugged-4df96566-2548-47bc-bd48-095ff9ce5a25 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.336 2 DEBUG nova.network.neutron [req-8d578775-f424-4733-a45c-99b7475d10fa req-3b39912f-7708-4f89-9b82-95c2372b1b7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Updated VIF entry in instance network info cache for port 4df96566-2548-47bc-bd48-095ff9ce5a25. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.337 2 DEBUG nova.network.neutron [req-8d578775-f424-4733-a45c-99b7475d10fa req-3b39912f-7708-4f89-9b82-95c2372b1b7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Updating instance_info_cache with network_info: [{"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.353 2 DEBUG oslo_concurrency.lockutils [req-8d578775-f424-4733-a45c-99b7475d10fa req-3b39912f-7708-4f89-9b82-95c2372b1b7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:08:30 compute-0 podman[146569]: 2025-10-08 19:08:30.393396691 +0000 UTC m=+0.110553571 container create fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:08:30 compute-0 podman[146569]: 2025-10-08 19:08:30.307542793 +0000 UTC m=+0.024699683 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 08 19:08:30 compute-0 systemd[1]: Started libpod-conmon-fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07.scope.
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.464 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950510.4635644, b81092db-79a9-4570-9579-4e100364515a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.466 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] VM Started (Lifecycle Event)
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.469 2 DEBUG nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.479 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 08 19:08:30 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.485 2 INFO nova.virt.libvirt.driver [-] [instance: b81092db-79a9-4570-9579-4e100364515a] Instance spawned successfully.
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.485 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.489 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:08:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58ea3c57cf88dfce89603da495e87146b2bb91b27139ea83f58571fbcf3d370c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.493 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.506 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.507 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.508 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.508 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.509 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.510 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.516 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.517 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950510.463702, b81092db-79a9-4570-9579-4e100364515a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.517 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] VM Paused (Lifecycle Event)
Oct 08 19:08:30 compute-0 podman[146582]: 2025-10-08 19:08:30.539286562 +0000 UTC m=+0.103328404 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct 08 19:08:30 compute-0 podman[146569]: 2025-10-08 19:08:30.53954607 +0000 UTC m=+0.256703000 container init fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:08:30 compute-0 podman[146569]: 2025-10-08 19:08:30.545666844 +0000 UTC m=+0.262823714 container start fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.556 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.560 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950510.4742796, b81092db-79a9-4570-9579-4e100364515a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.560 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] VM Resumed (Lifecycle Event)
Oct 08 19:08:30 compute-0 neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1[146594]: [NOTICE]   (146614) : New worker (146616) forked
Oct 08 19:08:30 compute-0 neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1[146594]: [NOTICE]   (146614) : Loading success.
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.578 2 INFO nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Took 7.79 seconds to spawn the instance on the hypervisor.
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.579 2 DEBUG nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.580 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.585 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.616 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.634 2 INFO nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Took 8.23 seconds to build instance.
Oct 08 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.648 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.299s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:32 compute-0 nova_compute[117514]: 2025-10-08 19:08:32.385 2 DEBUG nova.compute.manager [req-ea8b809c-a9ff-41ae-b67d-a79ae3eb998c req-fb490f23-3d26-4541-ae2e-7704eca0c9aa bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received event network-vif-plugged-4df96566-2548-47bc-bd48-095ff9ce5a25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:08:32 compute-0 nova_compute[117514]: 2025-10-08 19:08:32.386 2 DEBUG oslo_concurrency.lockutils [req-ea8b809c-a9ff-41ae-b67d-a79ae3eb998c req-fb490f23-3d26-4541-ae2e-7704eca0c9aa bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b81092db-79a9-4570-9579-4e100364515a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:32 compute-0 nova_compute[117514]: 2025-10-08 19:08:32.386 2 DEBUG oslo_concurrency.lockutils [req-ea8b809c-a9ff-41ae-b67d-a79ae3eb998c req-fb490f23-3d26-4541-ae2e-7704eca0c9aa bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:32 compute-0 nova_compute[117514]: 2025-10-08 19:08:32.387 2 DEBUG oslo_concurrency.lockutils [req-ea8b809c-a9ff-41ae-b67d-a79ae3eb998c req-fb490f23-3d26-4541-ae2e-7704eca0c9aa bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:32 compute-0 nova_compute[117514]: 2025-10-08 19:08:32.387 2 DEBUG nova.compute.manager [req-ea8b809c-a9ff-41ae-b67d-a79ae3eb998c req-fb490f23-3d26-4541-ae2e-7704eca0c9aa bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] No waiting events found dispatching network-vif-plugged-4df96566-2548-47bc-bd48-095ff9ce5a25 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:08:32 compute-0 nova_compute[117514]: 2025-10-08 19:08:32.388 2 WARNING nova.compute.manager [req-ea8b809c-a9ff-41ae-b67d-a79ae3eb998c req-fb490f23-3d26-4541-ae2e-7704eca0c9aa bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received unexpected event network-vif-plugged-4df96566-2548-47bc-bd48-095ff9ce5a25 for instance with vm_state active and task_state None.
Oct 08 19:08:33 compute-0 nova_compute[117514]: 2025-10-08 19:08:33.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:34 compute-0 ovn_controller[19759]: 2025-10-08T19:08:34Z|00070|binding|INFO|Releasing lport 9e4e54fa-32ec-4ece-b34d-e4e72c958a54 from this chassis (sb_readonly=0)
Oct 08 19:08:34 compute-0 nova_compute[117514]: 2025-10-08 19:08:34.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:34 compute-0 NetworkManager[1035]: <info>  [1759950514.3565] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Oct 08 19:08:34 compute-0 NetworkManager[1035]: <info>  [1759950514.3582] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Oct 08 19:08:34 compute-0 ovn_controller[19759]: 2025-10-08T19:08:34Z|00071|binding|INFO|Releasing lport 9e4e54fa-32ec-4ece-b34d-e4e72c958a54 from this chassis (sb_readonly=0)
Oct 08 19:08:34 compute-0 nova_compute[117514]: 2025-10-08 19:08:34.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:34 compute-0 nova_compute[117514]: 2025-10-08 19:08:34.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:34 compute-0 nova_compute[117514]: 2025-10-08 19:08:34.784 2 DEBUG nova.compute.manager [req-d2c83d13-68f2-4bb8-aae7-50db4a4e3a0a req-15308620-8bfe-4daf-8fe2-c133abc856cb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received event network-changed-4df96566-2548-47bc-bd48-095ff9ce5a25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:08:34 compute-0 nova_compute[117514]: 2025-10-08 19:08:34.785 2 DEBUG nova.compute.manager [req-d2c83d13-68f2-4bb8-aae7-50db4a4e3a0a req-15308620-8bfe-4daf-8fe2-c133abc856cb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Refreshing instance network info cache due to event network-changed-4df96566-2548-47bc-bd48-095ff9ce5a25. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:08:34 compute-0 nova_compute[117514]: 2025-10-08 19:08:34.785 2 DEBUG oslo_concurrency.lockutils [req-d2c83d13-68f2-4bb8-aae7-50db4a4e3a0a req-15308620-8bfe-4daf-8fe2-c133abc856cb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:08:34 compute-0 nova_compute[117514]: 2025-10-08 19:08:34.786 2 DEBUG oslo_concurrency.lockutils [req-d2c83d13-68f2-4bb8-aae7-50db4a4e3a0a req-15308620-8bfe-4daf-8fe2-c133abc856cb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:08:34 compute-0 nova_compute[117514]: 2025-10-08 19:08:34.786 2 DEBUG nova.network.neutron [req-d2c83d13-68f2-4bb8-aae7-50db4a4e3a0a req-15308620-8bfe-4daf-8fe2-c133abc856cb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Refreshing network info cache for port 4df96566-2548-47bc-bd48-095ff9ce5a25 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:08:35 compute-0 nova_compute[117514]: 2025-10-08 19:08:35.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:36 compute-0 nova_compute[117514]: 2025-10-08 19:08:36.088 2 DEBUG nova.network.neutron [req-d2c83d13-68f2-4bb8-aae7-50db4a4e3a0a req-15308620-8bfe-4daf-8fe2-c133abc856cb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Updated VIF entry in instance network info cache for port 4df96566-2548-47bc-bd48-095ff9ce5a25. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:08:36 compute-0 nova_compute[117514]: 2025-10-08 19:08:36.089 2 DEBUG nova.network.neutron [req-d2c83d13-68f2-4bb8-aae7-50db4a4e3a0a req-15308620-8bfe-4daf-8fe2-c133abc856cb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Updating instance_info_cache with network_info: [{"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:08:36 compute-0 nova_compute[117514]: 2025-10-08 19:08:36.108 2 DEBUG oslo_concurrency.lockutils [req-d2c83d13-68f2-4bb8-aae7-50db4a4e3a0a req-15308620-8bfe-4daf-8fe2-c133abc856cb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:08:38 compute-0 nova_compute[117514]: 2025-10-08 19:08:38.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:38 compute-0 podman[146626]: 2025-10-08 19:08:38.648777806 +0000 UTC m=+0.068493362 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 08 19:08:40 compute-0 nova_compute[117514]: 2025-10-08 19:08:40.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:41 compute-0 ovn_controller[19759]: 2025-10-08T19:08:41Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f7:31:02 10.100.0.4
Oct 08 19:08:41 compute-0 ovn_controller[19759]: 2025-10-08T19:08:41Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f7:31:02 10.100.0.4
Oct 08 19:08:43 compute-0 nova_compute[117514]: 2025-10-08 19:08:43.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:44.229 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:44.230 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:44.232 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:45 compute-0 nova_compute[117514]: 2025-10-08 19:08:45.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:47 compute-0 nova_compute[117514]: 2025-10-08 19:08:47.128 2 INFO nova.compute.manager [None req-8a926bd0-a0c3-4ef6-99b9-b743398bc5c0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Get console output
Oct 08 19:08:47 compute-0 nova_compute[117514]: 2025-10-08 19:08:47.134 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct 08 19:08:48 compute-0 nova_compute[117514]: 2025-10-08 19:08:48.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:48 compute-0 podman[146664]: 2025-10-08 19:08:48.691261744 +0000 UTC m=+0.099073297 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Oct 08 19:08:50 compute-0 nova_compute[117514]: 2025-10-08 19:08:50.399 2 DEBUG nova.compute.manager [req-4c9adc67-85d1-4b2e-8b2c-bd8ad6742b5d req-e9c4cd01-e14a-4c03-8053-919c35fee654 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received event network-changed-4df96566-2548-47bc-bd48-095ff9ce5a25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:08:50 compute-0 nova_compute[117514]: 2025-10-08 19:08:50.400 2 DEBUG nova.compute.manager [req-4c9adc67-85d1-4b2e-8b2c-bd8ad6742b5d req-e9c4cd01-e14a-4c03-8053-919c35fee654 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Refreshing instance network info cache due to event network-changed-4df96566-2548-47bc-bd48-095ff9ce5a25. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:08:50 compute-0 nova_compute[117514]: 2025-10-08 19:08:50.400 2 DEBUG oslo_concurrency.lockutils [req-4c9adc67-85d1-4b2e-8b2c-bd8ad6742b5d req-e9c4cd01-e14a-4c03-8053-919c35fee654 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:08:50 compute-0 nova_compute[117514]: 2025-10-08 19:08:50.401 2 DEBUG oslo_concurrency.lockutils [req-4c9adc67-85d1-4b2e-8b2c-bd8ad6742b5d req-e9c4cd01-e14a-4c03-8053-919c35fee654 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:08:50 compute-0 nova_compute[117514]: 2025-10-08 19:08:50.401 2 DEBUG nova.network.neutron [req-4c9adc67-85d1-4b2e-8b2c-bd8ad6742b5d req-e9c4cd01-e14a-4c03-8053-919c35fee654 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Refreshing network info cache for port 4df96566-2548-47bc-bd48-095ff9ce5a25 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:08:50 compute-0 nova_compute[117514]: 2025-10-08 19:08:50.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:52 compute-0 nova_compute[117514]: 2025-10-08 19:08:52.096 2 DEBUG nova.network.neutron [req-4c9adc67-85d1-4b2e-8b2c-bd8ad6742b5d req-e9c4cd01-e14a-4c03-8053-919c35fee654 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Updated VIF entry in instance network info cache for port 4df96566-2548-47bc-bd48-095ff9ce5a25. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:08:52 compute-0 nova_compute[117514]: 2025-10-08 19:08:52.096 2 DEBUG nova.network.neutron [req-4c9adc67-85d1-4b2e-8b2c-bd8ad6742b5d req-e9c4cd01-e14a-4c03-8053-919c35fee654 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Updating instance_info_cache with network_info: [{"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:08:52 compute-0 nova_compute[117514]: 2025-10-08 19:08:52.121 2 DEBUG oslo_concurrency.lockutils [req-4c9adc67-85d1-4b2e-8b2c-bd8ad6742b5d req-e9c4cd01-e14a-4c03-8053-919c35fee654 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:08:53 compute-0 nova_compute[117514]: 2025-10-08 19:08:53.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:54 compute-0 podman[146685]: 2025-10-08 19:08:54.656773657 +0000 UTC m=+0.070528253 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_id=edpm, container_name=openstack_network_exporter)
Oct 08 19:08:54 compute-0 podman[146686]: 2025-10-08 19:08:54.663807282 +0000 UTC m=+0.071257903 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 08 19:08:55 compute-0 nova_compute[117514]: 2025-10-08 19:08:55.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.129 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.130 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.152 2 DEBUG nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.231 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.232 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.240 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.240 2 INFO nova.compute.claims [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Claim successful on node compute-0.ctlplane.example.com
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.391 2 DEBUG nova.compute.provider_tree [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.411 2 DEBUG nova.scheduler.client.report [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.447 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.448 2 DEBUG nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.546 2 DEBUG nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.547 2 DEBUG nova.network.neutron [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.570 2 INFO nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.593 2 DEBUG nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 08 19:08:58 compute-0 podman[146725]: 2025-10-08 19:08:58.66721418 +0000 UTC m=+0.079004200 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.713 2 DEBUG nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.715 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.716 2 INFO nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Creating image(s)
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.717 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.717 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.718 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.743 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.834 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.835 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.836 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.859 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.941 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.942 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.012 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk 1073741824" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.013 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.014 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.102 2 DEBUG nova.policy [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 08 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.105 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.106 2 DEBUG nova.virt.disk.api [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 08 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.106 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.167 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.169 2 DEBUG nova.virt.disk.api [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 08 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.169 2 DEBUG nova.objects.instance [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.184 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 08 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.184 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Ensure instance console log exists: /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 08 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.185 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.185 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.185 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:00 compute-0 nova_compute[117514]: 2025-10-08 19:09:00.200 2 DEBUG nova.network.neutron [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Successfully created port: a70af23b-d9f3-4d3e-96da-692ae05ba88a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 08 19:09:00 compute-0 podman[146766]: 2025-10-08 19:09:00.641607535 +0000 UTC m=+0.059965964 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 08 19:09:00 compute-0 podman[146764]: 2025-10-08 19:09:00.646407525 +0000 UTC m=+0.065304860 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:09:00 compute-0 podman[146765]: 2025-10-08 19:09:00.671190409 +0000 UTC m=+0.095466991 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:09:00 compute-0 nova_compute[117514]: 2025-10-08 19:09:00.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:01 compute-0 nova_compute[117514]: 2025-10-08 19:09:01.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:01.105 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:09:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:01.106 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 08 19:09:01 compute-0 nova_compute[117514]: 2025-10-08 19:09:01.317 2 DEBUG nova.network.neutron [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Successfully updated port: a70af23b-d9f3-4d3e-96da-692ae05ba88a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 08 19:09:01 compute-0 nova_compute[117514]: 2025-10-08 19:09:01.335 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:09:01 compute-0 nova_compute[117514]: 2025-10-08 19:09:01.335 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:09:01 compute-0 nova_compute[117514]: 2025-10-08 19:09:01.335 2 DEBUG nova.network.neutron [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 08 19:09:01 compute-0 nova_compute[117514]: 2025-10-08 19:09:01.405 2 DEBUG nova.compute.manager [req-01ada483-92b7-4585-8c70-768b45246a77 req-3c44a16d-ae04-4a35-9db1-7126826df1b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received event network-changed-a70af23b-d9f3-4d3e-96da-692ae05ba88a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:09:01 compute-0 nova_compute[117514]: 2025-10-08 19:09:01.406 2 DEBUG nova.compute.manager [req-01ada483-92b7-4585-8c70-768b45246a77 req-3c44a16d-ae04-4a35-9db1-7126826df1b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Refreshing instance network info cache due to event network-changed-a70af23b-d9f3-4d3e-96da-692ae05ba88a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:09:01 compute-0 nova_compute[117514]: 2025-10-08 19:09:01.406 2 DEBUG oslo_concurrency.lockutils [req-01ada483-92b7-4585-8c70-768b45246a77 req-3c44a16d-ae04-4a35-9db1-7126826df1b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:09:01 compute-0 nova_compute[117514]: 2025-10-08 19:09:01.470 2 DEBUG nova.network.neutron [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.555 2 DEBUG nova.network.neutron [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Updating instance_info_cache with network_info: [{"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.572 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.573 2 DEBUG nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Instance network_info: |[{"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.574 2 DEBUG oslo_concurrency.lockutils [req-01ada483-92b7-4585-8c70-768b45246a77 req-3c44a16d-ae04-4a35-9db1-7126826df1b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.574 2 DEBUG nova.network.neutron [req-01ada483-92b7-4585-8c70-768b45246a77 req-3c44a16d-ae04-4a35-9db1-7126826df1b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Refreshing network info cache for port a70af23b-d9f3-4d3e-96da-692ae05ba88a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.579 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Start _get_guest_xml network_info=[{"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.586 2 WARNING nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.592 2 DEBUG nova.virt.libvirt.host [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.593 2 DEBUG nova.virt.libvirt.host [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.598 2 DEBUG nova.virt.libvirt.host [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.598 2 DEBUG nova.virt.libvirt.host [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.599 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.600 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.600 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.601 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.601 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.602 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.602 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.602 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.603 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.603 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.604 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.604 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.610 2 DEBUG nova.virt.libvirt.vif [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2036009954',display_name='tempest-TestNetworkBasicOps-server-2036009954',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2036009954',id=5,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIaQYsS3p/kqkwhPUSySAXSbxOdURRwycGYz8zG+mSEb7vM+V8TI/DnRVmOc+q/Hcp4ljBTmVN8Dn0Fwxkk8IhqlYVJKZ25JiPY8aDaNHw2HT5FEQjUWsRu8yiFEP7RRtA==',key_name='tempest-TestNetworkBasicOps-1573009253',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-4zycm9c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:08:58Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=5f1c7c12-d16a-4158-9af6-e40d7ad01f2e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.610 2 DEBUG nova.network.os_vif_util [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.611 2 DEBUG nova.network.os_vif_util [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:aa:70,bridge_name='br-int',has_traffic_filtering=True,id=a70af23b-d9f3-4d3e-96da-692ae05ba88a,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa70af23b-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.613 2 DEBUG nova.objects.instance [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.628 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] End _get_guest_xml xml=<domain type="kvm">
Oct 08 19:09:02 compute-0 nova_compute[117514]:   <uuid>5f1c7c12-d16a-4158-9af6-e40d7ad01f2e</uuid>
Oct 08 19:09:02 compute-0 nova_compute[117514]:   <name>instance-00000005</name>
Oct 08 19:09:02 compute-0 nova_compute[117514]:   <memory>131072</memory>
Oct 08 19:09:02 compute-0 nova_compute[117514]:   <vcpu>1</vcpu>
Oct 08 19:09:02 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <nova:name>tempest-TestNetworkBasicOps-server-2036009954</nova:name>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <nova:creationTime>2025-10-08 19:09:02</nova:creationTime>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <nova:flavor name="m1.nano">
Oct 08 19:09:02 compute-0 nova_compute[117514]:         <nova:memory>128</nova:memory>
Oct 08 19:09:02 compute-0 nova_compute[117514]:         <nova:disk>1</nova:disk>
Oct 08 19:09:02 compute-0 nova_compute[117514]:         <nova:swap>0</nova:swap>
Oct 08 19:09:02 compute-0 nova_compute[117514]:         <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:09:02 compute-0 nova_compute[117514]:         <nova:vcpus>1</nova:vcpus>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       </nova:flavor>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <nova:owner>
Oct 08 19:09:02 compute-0 nova_compute[117514]:         <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:09:02 compute-0 nova_compute[117514]:         <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       </nova:owner>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <nova:ports>
Oct 08 19:09:02 compute-0 nova_compute[117514]:         <nova:port uuid="a70af23b-d9f3-4d3e-96da-692ae05ba88a">
Oct 08 19:09:02 compute-0 nova_compute[117514]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:         </nova:port>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       </nova:ports>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     </nova:instance>
Oct 08 19:09:02 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:09:02 compute-0 nova_compute[117514]:   <sysinfo type="smbios">
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <system>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <entry name="manufacturer">RDO</entry>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <entry name="product">OpenStack Compute</entry>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <entry name="serial">5f1c7c12-d16a-4158-9af6-e40d7ad01f2e</entry>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <entry name="uuid">5f1c7c12-d16a-4158-9af6-e40d7ad01f2e</entry>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <entry name="family">Virtual Machine</entry>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     </system>
Oct 08 19:09:02 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:09:02 compute-0 nova_compute[117514]:   <os>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <boot dev="hd"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <smbios mode="sysinfo"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:   </os>
Oct 08 19:09:02 compute-0 nova_compute[117514]:   <features>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <vmcoreinfo/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:   </features>
Oct 08 19:09:02 compute-0 nova_compute[117514]:   <clock offset="utc">
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <timer name="pit" tickpolicy="delay"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <timer name="hpet" present="no"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:09:02 compute-0 nova_compute[117514]:   <cpu mode="host-model" match="exact">
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <topology sockets="1" cores="1" threads="1"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:09:02 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <disk type="file" device="disk">
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <target dev="vda" bus="virtio"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <disk type="file" device="cdrom">
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <driver name="qemu" type="raw" cache="none"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk.config"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <target dev="sda" bus="sata"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <interface type="ethernet">
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <mac address="fa:16:3e:1c:aa:70"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <mtu size="1442"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <target dev="tapa70af23b-d9"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <serial type="pty">
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <log file="/var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/console.log" append="off"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <video>
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     </video>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <input type="tablet" bus="usb"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <rng model="virtio">
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <backend model="random">/dev/urandom</backend>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <controller type="usb" index="0"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     <memballoon model="virtio">
Oct 08 19:09:02 compute-0 nova_compute[117514]:       <stats period="10"/>
Oct 08 19:09:02 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:09:02 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:09:02 compute-0 nova_compute[117514]: </domain>
Oct 08 19:09:02 compute-0 nova_compute[117514]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.629 2 DEBUG nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Preparing to wait for external event network-vif-plugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.630 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.631 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.631 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.632 2 DEBUG nova.virt.libvirt.vif [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2036009954',display_name='tempest-TestNetworkBasicOps-server-2036009954',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2036009954',id=5,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIaQYsS3p/kqkwhPUSySAXSbxOdURRwycGYz8zG+mSEb7vM+V8TI/DnRVmOc+q/Hcp4ljBTmVN8Dn0Fwxkk8IhqlYVJKZ25JiPY8aDaNHw2HT5FEQjUWsRu8yiFEP7RRtA==',key_name='tempest-TestNetworkBasicOps-1573009253',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-4zycm9c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:08:58Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=5f1c7c12-d16a-4158-9af6-e40d7ad01f2e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.633 2 DEBUG nova.network.os_vif_util [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.634 2 DEBUG nova.network.os_vif_util [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:aa:70,bridge_name='br-int',has_traffic_filtering=True,id=a70af23b-d9f3-4d3e-96da-692ae05ba88a,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa70af23b-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.636 2 DEBUG os_vif [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:aa:70,bridge_name='br-int',has_traffic_filtering=True,id=a70af23b-d9f3-4d3e-96da-692ae05ba88a,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa70af23b-d9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.637 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.638 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.643 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa70af23b-d9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.643 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa70af23b-d9, col_values=(('external_ids', {'iface-id': 'a70af23b-d9f3-4d3e-96da-692ae05ba88a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1c:aa:70', 'vm-uuid': '5f1c7c12-d16a-4158-9af6-e40d7ad01f2e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:02 compute-0 NetworkManager[1035]: <info>  [1759950542.6487] manager: (tapa70af23b-d9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.660 2 INFO os_vif [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:aa:70,bridge_name='br-int',has_traffic_filtering=True,id=a70af23b-d9f3-4d3e-96da-692ae05ba88a,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa70af23b-d9')
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.720 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.720 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.721 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:1c:aa:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 08 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.721 2 INFO nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Using config drive
Oct 08 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.260 2 INFO nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Creating config drive at /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk.config
Oct 08 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.269 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt2k8v3fd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.406 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt2k8v3fd" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:09:03 compute-0 kernel: tapa70af23b-d9: entered promiscuous mode
Oct 08 19:09:03 compute-0 NetworkManager[1035]: <info>  [1759950543.4873] manager: (tapa70af23b-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Oct 08 19:09:03 compute-0 ovn_controller[19759]: 2025-10-08T19:09:03Z|00072|binding|INFO|Claiming lport a70af23b-d9f3-4d3e-96da-692ae05ba88a for this chassis.
Oct 08 19:09:03 compute-0 ovn_controller[19759]: 2025-10-08T19:09:03Z|00073|binding|INFO|a70af23b-d9f3-4d3e-96da-692ae05ba88a: Claiming fa:16:3e:1c:aa:70 10.100.0.9
Oct 08 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.500 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:aa:70 10.100.0.9'], port_security=['fa:16:3e:1c:aa:70 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5f1c7c12-d16a-4158-9af6-e40d7ad01f2e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '02048498-a771-4306-8e83-ef79600f50a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f7e04c-5c12-4776-b9f7-f4835ede26c3, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=a70af23b-d9f3-4d3e-96da-692ae05ba88a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.502 28643 INFO neutron.agent.ovn.metadata.agent [-] Port a70af23b-d9f3-4d3e-96da-692ae05ba88a in datapath 820a3a2e-47e5-4f6d-88d6-281476a31fb1 bound to our chassis
Oct 08 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.504 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 820a3a2e-47e5-4f6d-88d6-281476a31fb1
Oct 08 19:09:03 compute-0 ovn_controller[19759]: 2025-10-08T19:09:03Z|00074|binding|INFO|Setting lport a70af23b-d9f3-4d3e-96da-692ae05ba88a ovn-installed in OVS
Oct 08 19:09:03 compute-0 ovn_controller[19759]: 2025-10-08T19:09:03Z|00075|binding|INFO|Setting lport a70af23b-d9f3-4d3e-96da-692ae05ba88a up in Southbound
Oct 08 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.525 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f091f6fb-9872-4ea4-88e6-ae832917372b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:03 compute-0 systemd-udevd[146848]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:09:03 compute-0 NetworkManager[1035]: <info>  [1759950543.5553] device (tapa70af23b-d9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 08 19:09:03 compute-0 systemd-machined[77568]: New machine qemu-5-instance-00000005.
Oct 08 19:09:03 compute-0 NetworkManager[1035]: <info>  [1759950543.5566] device (tapa70af23b-d9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 08 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.568 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[ddbb015d-92d0-427a-ade4-ca0de6f9ca3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:03 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000005.
Oct 08 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.574 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[d4475a19-d274-4c01-a139-91b796280b29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.608 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[d441cfad-f7ff-404e-8b0f-227fa0afca04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.631 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[96e6100d-678e-4bf2-97a5-f6f79865a0e4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap820a3a2e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:c1:bb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 118031, 'reachable_time': 41703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 146862, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.652 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[106b205f-8f39-44e2-8ead-2e3239cf51d1]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap820a3a2e-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 118046, 'tstamp': 118046}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 146864, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap820a3a2e-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 118049, 'tstamp': 118049}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 146864, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.654 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap820a3a2e-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.658 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap820a3a2e-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.658 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.659 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap820a3a2e-40, col_values=(('external_ids', {'iface-id': '9e4e54fa-32ec-4ece-b34d-e4e72c958a54'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.659 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.705 2 DEBUG nova.compute.manager [req-bb5bf8da-77e7-4702-8d9c-8067d5dbb988 req-5178622a-22b5-4a5a-9970-9ec1fcbd811b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received event network-vif-plugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.706 2 DEBUG oslo_concurrency.lockutils [req-bb5bf8da-77e7-4702-8d9c-8067d5dbb988 req-5178622a-22b5-4a5a-9970-9ec1fcbd811b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.706 2 DEBUG oslo_concurrency.lockutils [req-bb5bf8da-77e7-4702-8d9c-8067d5dbb988 req-5178622a-22b5-4a5a-9970-9ec1fcbd811b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.706 2 DEBUG oslo_concurrency.lockutils [req-bb5bf8da-77e7-4702-8d9c-8067d5dbb988 req-5178622a-22b5-4a5a-9970-9ec1fcbd811b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.706 2 DEBUG nova.compute.manager [req-bb5bf8da-77e7-4702-8d9c-8067d5dbb988 req-5178622a-22b5-4a5a-9970-9ec1fcbd811b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Processing event network-vif-plugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 08 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:09:04 compute-0 nova_compute[117514]: 2025-10-08 19:09:04.124 2 DEBUG nova.network.neutron [req-01ada483-92b7-4585-8c70-768b45246a77 req-3c44a16d-ae04-4a35-9db1-7126826df1b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Updated VIF entry in instance network info cache for port a70af23b-d9f3-4d3e-96da-692ae05ba88a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:09:04 compute-0 nova_compute[117514]: 2025-10-08 19:09:04.125 2 DEBUG nova.network.neutron [req-01ada483-92b7-4585-8c70-768b45246a77 req-3c44a16d-ae04-4a35-9db1-7126826df1b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Updating instance_info_cache with network_info: [{"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:09:04 compute-0 nova_compute[117514]: 2025-10-08 19:09:04.138 2 DEBUG oslo_concurrency.lockutils [req-01ada483-92b7-4585-8c70-768b45246a77 req-3c44a16d-ae04-4a35-9db1-7126826df1b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:09:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:05.109 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.288 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950545.288375, 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.289 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] VM Started (Lifecycle Event)
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.292 2 DEBUG nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.297 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.301 2 INFO nova.virt.libvirt.driver [-] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Instance spawned successfully.
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.302 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.309 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.313 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.325 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.325 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.326 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.327 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.328 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.329 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.338 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.339 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950545.2894785, 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.339 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] VM Paused (Lifecycle Event)
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.368 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.374 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950545.2960498, 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.375 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] VM Resumed (Lifecycle Event)
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.400 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.406 2 INFO nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Took 6.69 seconds to spawn the instance on the hypervisor.
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.407 2 DEBUG nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.412 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.447 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.488 2 INFO nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Took 7.28 seconds to build instance.
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.506 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.749 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.750 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.750 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.751 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.790 2 DEBUG nova.compute.manager [req-55246877-71c7-46b5-9836-319b4690b26d req-f280551e-d232-455f-9ccb-588eb5bdd096 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received event network-vif-plugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.791 2 DEBUG oslo_concurrency.lockutils [req-55246877-71c7-46b5-9836-319b4690b26d req-f280551e-d232-455f-9ccb-588eb5bdd096 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.791 2 DEBUG oslo_concurrency.lockutils [req-55246877-71c7-46b5-9836-319b4690b26d req-f280551e-d232-455f-9ccb-588eb5bdd096 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.792 2 DEBUG oslo_concurrency.lockutils [req-55246877-71c7-46b5-9836-319b4690b26d req-f280551e-d232-455f-9ccb-588eb5bdd096 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.792 2 DEBUG nova.compute.manager [req-55246877-71c7-46b5-9836-319b4690b26d req-f280551e-d232-455f-9ccb-588eb5bdd096 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] No waiting events found dispatching network-vif-plugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.793 2 WARNING nova.compute.manager [req-55246877-71c7-46b5-9836-319b4690b26d req-f280551e-d232-455f-9ccb-588eb5bdd096 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received unexpected event network-vif-plugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a for instance with vm_state active and task_state None.
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.845 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.936 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.937 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.006 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.011 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.078 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.079 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.166 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.355 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.357 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5908MB free_disk=73.38648223876953GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.358 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.358 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.437 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Instance b81092db-79a9-4570-9579-4e100364515a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 08 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.438 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Instance 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 08 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.439 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.439 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.505 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.522 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.545 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.546 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.543 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.544 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.545 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.545 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.706 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.706 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquired lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.706 2 DEBUG nova.network.neutron [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 08 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.707 2 DEBUG nova.objects.instance [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b81092db-79a9-4570-9579-4e100364515a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.857 2 DEBUG nova.compute.manager [req-30f4ef18-5ee5-4a7c-9b63-d48047bfe8c5 req-01b95e2d-b09b-46f7-864e-36c4c34a8847 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received event network-changed-a70af23b-d9f3-4d3e-96da-692ae05ba88a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.858 2 DEBUG nova.compute.manager [req-30f4ef18-5ee5-4a7c-9b63-d48047bfe8c5 req-01b95e2d-b09b-46f7-864e-36c4c34a8847 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Refreshing instance network info cache due to event network-changed-a70af23b-d9f3-4d3e-96da-692ae05ba88a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.859 2 DEBUG oslo_concurrency.lockutils [req-30f4ef18-5ee5-4a7c-9b63-d48047bfe8c5 req-01b95e2d-b09b-46f7-864e-36c4c34a8847 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.860 2 DEBUG oslo_concurrency.lockutils [req-30f4ef18-5ee5-4a7c-9b63-d48047bfe8c5 req-01b95e2d-b09b-46f7-864e-36c4c34a8847 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.860 2 DEBUG nova.network.neutron [req-30f4ef18-5ee5-4a7c-9b63-d48047bfe8c5 req-01b95e2d-b09b-46f7-864e-36c4c34a8847 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Refreshing network info cache for port a70af23b-d9f3-4d3e-96da-692ae05ba88a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:09:09 compute-0 podman[146886]: 2025-10-08 19:09:09.696706394 +0000 UTC m=+0.105360650 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 08 19:09:10 compute-0 nova_compute[117514]: 2025-10-08 19:09:10.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.518 2 DEBUG nova.network.neutron [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] Updating instance_info_cache with network_info: [{"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.533 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Releasing lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.534 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 08 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.534 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.535 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.535 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.535 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.535 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.620 2 DEBUG nova.network.neutron [req-30f4ef18-5ee5-4a7c-9b63-d48047bfe8c5 req-01b95e2d-b09b-46f7-864e-36c4c34a8847 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Updated VIF entry in instance network info cache for port a70af23b-d9f3-4d3e-96da-692ae05ba88a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.620 2 DEBUG nova.network.neutron [req-30f4ef18-5ee5-4a7c-9b63-d48047bfe8c5 req-01b95e2d-b09b-46f7-864e-36c4c34a8847 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Updating instance_info_cache with network_info: [{"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.639 2 DEBUG oslo_concurrency.lockutils [req-30f4ef18-5ee5-4a7c-9b63-d48047bfe8c5 req-01b95e2d-b09b-46f7-864e-36c4c34a8847 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:09:12 compute-0 nova_compute[117514]: 2025-10-08 19:09:12.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:12 compute-0 nova_compute[117514]: 2025-10-08 19:09:12.703 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:09:15 compute-0 nova_compute[117514]: 2025-10-08 19:09:15.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:17 compute-0 ovn_controller[19759]: 2025-10-08T19:09:17Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1c:aa:70 10.100.0.9
Oct 08 19:09:17 compute-0 ovn_controller[19759]: 2025-10-08T19:09:17Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1c:aa:70 10.100.0.9
Oct 08 19:09:17 compute-0 nova_compute[117514]: 2025-10-08 19:09:17.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:19 compute-0 podman[146923]: 2025-10-08 19:09:19.650331707 +0000 UTC m=+0.070257445 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:09:20 compute-0 nova_compute[117514]: 2025-10-08 19:09:20.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:22 compute-0 nova_compute[117514]: 2025-10-08 19:09:22.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:23 compute-0 nova_compute[117514]: 2025-10-08 19:09:23.833 2 INFO nova.compute.manager [None req-42783f33-f50f-4406-9f02-e4b4c7a44b55 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Get console output
Oct 08 19:09:23 compute-0 nova_compute[117514]: 2025-10-08 19:09:23.839 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.187 2 DEBUG oslo_concurrency.lockutils [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.187 2 DEBUG oslo_concurrency.lockutils [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.188 2 DEBUG oslo_concurrency.lockutils [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.188 2 DEBUG oslo_concurrency.lockutils [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.188 2 DEBUG oslo_concurrency.lockutils [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.190 2 INFO nova.compute.manager [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Terminating instance
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.191 2 DEBUG nova.compute.manager [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 08 19:09:24 compute-0 kernel: tapa70af23b-d9 (unregistering): left promiscuous mode
Oct 08 19:09:24 compute-0 NetworkManager[1035]: <info>  [1759950564.2263] device (tapa70af23b-d9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 19:09:24 compute-0 ovn_controller[19759]: 2025-10-08T19:09:24Z|00076|binding|INFO|Releasing lport a70af23b-d9f3-4d3e-96da-692ae05ba88a from this chassis (sb_readonly=0)
Oct 08 19:09:24 compute-0 ovn_controller[19759]: 2025-10-08T19:09:24Z|00077|binding|INFO|Setting lport a70af23b-d9f3-4d3e-96da-692ae05ba88a down in Southbound
Oct 08 19:09:24 compute-0 ovn_controller[19759]: 2025-10-08T19:09:24Z|00078|binding|INFO|Removing iface tapa70af23b-d9 ovn-installed in OVS
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.249 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:aa:70 10.100.0.9'], port_security=['fa:16:3e:1c:aa:70 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5f1c7c12-d16a-4158-9af6-e40d7ad01f2e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '02048498-a771-4306-8e83-ef79600f50a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.212'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f7e04c-5c12-4776-b9f7-f4835ede26c3, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=a70af23b-d9f3-4d3e-96da-692ae05ba88a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.250 28643 INFO neutron.agent.ovn.metadata.agent [-] Port a70af23b-d9f3-4d3e-96da-692ae05ba88a in datapath 820a3a2e-47e5-4f6d-88d6-281476a31fb1 unbound from our chassis
Oct 08 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.251 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 820a3a2e-47e5-4f6d-88d6-281476a31fb1
Oct 08 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.267 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[38340c45-21a4-4974-8311-9077915692b1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.288 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[38d8a6dd-f475-4afb-9f0c-c70bd48cf91c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.290 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[33b83c75-569c-4cb6-9eb1-1435c234431d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:24 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Deactivated successfully.
Oct 08 19:09:24 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Consumed 13.385s CPU time.
Oct 08 19:09:24 compute-0 systemd-machined[77568]: Machine qemu-5-instance-00000005 terminated.
Oct 08 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.310 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[6b3921a2-7291-4820-b23d-6728c569e13d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.341 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[fc9fc569-5ee7-454e-8a09-cf55f38e5184]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap820a3a2e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:c1:bb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 118031, 'reachable_time': 41703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 146954, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.361 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b5e3f4a5-5ffc-4bc5-a2a4-d9455d6f5537]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap820a3a2e-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 118046, 'tstamp': 118046}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 146955, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap820a3a2e-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 118049, 'tstamp': 118049}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 146955, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.363 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap820a3a2e-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.369 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap820a3a2e-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.369 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.369 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap820a3a2e-40, col_values=(('external_ids', {'iface-id': '9e4e54fa-32ec-4ece-b34d-e4e72c958a54'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.369 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.470 2 INFO nova.virt.libvirt.driver [-] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Instance destroyed successfully.
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.470 2 DEBUG nova.objects.instance [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.490 2 DEBUG nova.virt.libvirt.vif [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2036009954',display_name='tempest-TestNetworkBasicOps-server-2036009954',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2036009954',id=5,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIaQYsS3p/kqkwhPUSySAXSbxOdURRwycGYz8zG+mSEb7vM+V8TI/DnRVmOc+q/Hcp4ljBTmVN8Dn0Fwxkk8IhqlYVJKZ25JiPY8aDaNHw2HT5FEQjUWsRu8yiFEP7RRtA==',key_name='tempest-TestNetworkBasicOps-1573009253',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:09:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-4zycm9c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:09:05Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=5f1c7c12-d16a-4158-9af6-e40d7ad01f2e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.490 2 DEBUG nova.network.os_vif_util [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.491 2 DEBUG nova.network.os_vif_util [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1c:aa:70,bridge_name='br-int',has_traffic_filtering=True,id=a70af23b-d9f3-4d3e-96da-692ae05ba88a,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa70af23b-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.492 2 DEBUG os_vif [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:aa:70,bridge_name='br-int',has_traffic_filtering=True,id=a70af23b-d9f3-4d3e-96da-692ae05ba88a,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa70af23b-d9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.494 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa70af23b-d9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.500 2 INFO os_vif [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:aa:70,bridge_name='br-int',has_traffic_filtering=True,id=a70af23b-d9f3-4d3e-96da-692ae05ba88a,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa70af23b-d9')
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.501 2 INFO nova.virt.libvirt.driver [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Deleting instance files /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e_del
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.502 2 INFO nova.virt.libvirt.driver [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Deletion of /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e_del complete
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.552 2 INFO nova.compute.manager [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Took 0.36 seconds to destroy the instance on the hypervisor.
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.553 2 DEBUG oslo.service.loopingcall [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.553 2 DEBUG nova.compute.manager [-] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 08 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.553 2 DEBUG nova.network.neutron [-] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 08 19:09:25 compute-0 nova_compute[117514]: 2025-10-08 19:09:25.316 2 DEBUG nova.compute.manager [req-b9d21aea-26e2-474d-bc90-eb2091a5bd31 req-f5f6fb9c-5c0c-4b13-aed2-7a6effe0b3a6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received event network-vif-unplugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:09:25 compute-0 nova_compute[117514]: 2025-10-08 19:09:25.316 2 DEBUG oslo_concurrency.lockutils [req-b9d21aea-26e2-474d-bc90-eb2091a5bd31 req-f5f6fb9c-5c0c-4b13-aed2-7a6effe0b3a6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:25 compute-0 nova_compute[117514]: 2025-10-08 19:09:25.317 2 DEBUG oslo_concurrency.lockutils [req-b9d21aea-26e2-474d-bc90-eb2091a5bd31 req-f5f6fb9c-5c0c-4b13-aed2-7a6effe0b3a6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:25 compute-0 nova_compute[117514]: 2025-10-08 19:09:25.317 2 DEBUG oslo_concurrency.lockutils [req-b9d21aea-26e2-474d-bc90-eb2091a5bd31 req-f5f6fb9c-5c0c-4b13-aed2-7a6effe0b3a6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:25 compute-0 nova_compute[117514]: 2025-10-08 19:09:25.318 2 DEBUG nova.compute.manager [req-b9d21aea-26e2-474d-bc90-eb2091a5bd31 req-f5f6fb9c-5c0c-4b13-aed2-7a6effe0b3a6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] No waiting events found dispatching network-vif-unplugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:09:25 compute-0 nova_compute[117514]: 2025-10-08 19:09:25.319 2 DEBUG nova.compute.manager [req-b9d21aea-26e2-474d-bc90-eb2091a5bd31 req-f5f6fb9c-5c0c-4b13-aed2-7a6effe0b3a6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received event network-vif-unplugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 08 19:09:25 compute-0 podman[146979]: 2025-10-08 19:09:25.65175334 +0000 UTC m=+0.066717241 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct 08 19:09:25 compute-0 podman[146978]: 2025-10-08 19:09:25.689347979 +0000 UTC m=+0.100684574 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, io.openshift.expose-services=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Oct 08 19:09:25 compute-0 nova_compute[117514]: 2025-10-08 19:09:25.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.266 2 DEBUG nova.network.neutron [-] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.283 2 INFO nova.compute.manager [-] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Took 2.73 seconds to deallocate network for instance.
Oct 08 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.342 2 DEBUG oslo_concurrency.lockutils [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.343 2 DEBUG oslo_concurrency.lockutils [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.352 2 DEBUG nova.compute.manager [req-7ac36a26-bfb8-46c7-8e37-412c308c1da5 req-79acea6d-79a9-4bfd-a053-7c8b09f59ffd bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received event network-vif-deleted-a70af23b-d9f3-4d3e-96da-692ae05ba88a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.413 2 DEBUG nova.compute.manager [req-733ea0c2-6dbc-4a08-95c0-ac5b7c146550 req-f4690235-3128-4f44-b446-d55f3509f0f2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received event network-vif-plugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.414 2 DEBUG oslo_concurrency.lockutils [req-733ea0c2-6dbc-4a08-95c0-ac5b7c146550 req-f4690235-3128-4f44-b446-d55f3509f0f2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.414 2 DEBUG oslo_concurrency.lockutils [req-733ea0c2-6dbc-4a08-95c0-ac5b7c146550 req-f4690235-3128-4f44-b446-d55f3509f0f2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.414 2 DEBUG oslo_concurrency.lockutils [req-733ea0c2-6dbc-4a08-95c0-ac5b7c146550 req-f4690235-3128-4f44-b446-d55f3509f0f2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.414 2 DEBUG nova.compute.manager [req-733ea0c2-6dbc-4a08-95c0-ac5b7c146550 req-f4690235-3128-4f44-b446-d55f3509f0f2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] No waiting events found dispatching network-vif-plugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.414 2 WARNING nova.compute.manager [req-733ea0c2-6dbc-4a08-95c0-ac5b7c146550 req-f4690235-3128-4f44-b446-d55f3509f0f2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received unexpected event network-vif-plugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a for instance with vm_state deleted and task_state None.
Oct 08 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.420 2 DEBUG nova.compute.provider_tree [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.435 2 DEBUG nova.scheduler.client.report [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.457 2 DEBUG oslo_concurrency.lockutils [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.481 2 INFO nova.scheduler.client.report [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e
Oct 08 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.559 2 DEBUG oslo_concurrency.lockutils [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:29 compute-0 podman[147018]: 2025-10-08 19:09:29.663832139 +0000 UTC m=+0.081529223 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 08 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.699 2 DEBUG oslo_concurrency.lockutils [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "b81092db-79a9-4570-9579-4e100364515a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.699 2 DEBUG oslo_concurrency.lockutils [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.699 2 DEBUG oslo_concurrency.lockutils [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "b81092db-79a9-4570-9579-4e100364515a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.699 2 DEBUG oslo_concurrency.lockutils [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.700 2 DEBUG oslo_concurrency.lockutils [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.701 2 INFO nova.compute.manager [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Terminating instance
Oct 08 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.702 2 DEBUG nova.compute.manager [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 08 19:09:29 compute-0 kernel: tap4df96566-25 (unregistering): left promiscuous mode
Oct 08 19:09:29 compute-0 NetworkManager[1035]: <info>  [1759950569.7353] device (tap4df96566-25): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:29 compute-0 ovn_controller[19759]: 2025-10-08T19:09:29Z|00079|binding|INFO|Releasing lport 4df96566-2548-47bc-bd48-095ff9ce5a25 from this chassis (sb_readonly=0)
Oct 08 19:09:29 compute-0 ovn_controller[19759]: 2025-10-08T19:09:29Z|00080|binding|INFO|Setting lport 4df96566-2548-47bc-bd48-095ff9ce5a25 down in Southbound
Oct 08 19:09:29 compute-0 ovn_controller[19759]: 2025-10-08T19:09:29Z|00081|binding|INFO|Removing iface tap4df96566-25 ovn-installed in OVS
Oct 08 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:29.756 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:31:02 10.100.0.4'], port_security=['fa:16:3e:f7:31:02 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b81092db-79a9-4570-9579-4e100364515a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b3c14dd0-3cf2-41c1-9115-bc2ef0b741ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f7e04c-5c12-4776-b9f7-f4835ede26c3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=4df96566-2548-47bc-bd48-095ff9ce5a25) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:09:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:29.758 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 4df96566-2548-47bc-bd48-095ff9ce5a25 in datapath 820a3a2e-47e5-4f6d-88d6-281476a31fb1 unbound from our chassis
Oct 08 19:09:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:29.759 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 820a3a2e-47e5-4f6d-88d6-281476a31fb1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 08 19:09:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:29.761 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[2153d278-f662-4958-8a3e-19f8fded96a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:29.762 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1 namespace which is not needed anymore
Oct 08 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:29 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Deactivated successfully.
Oct 08 19:09:29 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Consumed 16.423s CPU time.
Oct 08 19:09:29 compute-0 systemd-machined[77568]: Machine qemu-4-instance-00000004 terminated.
Oct 08 19:09:29 compute-0 neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1[146594]: [NOTICE]   (146614) : haproxy version is 2.8.14-c23fe91
Oct 08 19:09:29 compute-0 neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1[146594]: [NOTICE]   (146614) : path to executable is /usr/sbin/haproxy
Oct 08 19:09:29 compute-0 neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1[146594]: [WARNING]  (146614) : Exiting Master process...
Oct 08 19:09:29 compute-0 neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1[146594]: [ALERT]    (146614) : Current worker (146616) exited with code 143 (Terminated)
Oct 08 19:09:29 compute-0 neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1[146594]: [WARNING]  (146614) : All workers exited. Exiting... (0)
Oct 08 19:09:29 compute-0 systemd[1]: libpod-fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07.scope: Deactivated successfully.
Oct 08 19:09:29 compute-0 podman[147066]: 2025-10-08 19:09:29.958498842 +0000 UTC m=+0.068776461 container died fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 08 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.983 2 INFO nova.virt.libvirt.driver [-] [instance: b81092db-79a9-4570-9579-4e100364515a] Instance destroyed successfully.
Oct 08 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.983 2 DEBUG nova.objects.instance [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid b81092db-79a9-4570-9579-4e100364515a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:09:29 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07-userdata-shm.mount: Deactivated successfully.
Oct 08 19:09:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-58ea3c57cf88dfce89603da495e87146b2bb91b27139ea83f58571fbcf3d370c-merged.mount: Deactivated successfully.
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.001 2 DEBUG nova.virt.libvirt.vif [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:08:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1923358122',display_name='tempest-TestNetworkBasicOps-server-1923358122',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1923358122',id=4,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLQLVJLI0B1DuDHRr0xZejVz519BcFo77SQm/iU8QOSD6bvHcTPIzjucvYocQDiXeDjzdepuMi6T99yqrAkyTWA86BuQoBq3ywvQZ7i+b1z4o3zuHDlJxNAK8zAsugXiSA==',key_name='tempest-TestNetworkBasicOps-993932891',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:08:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-bunw0mg3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:08:30Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b81092db-79a9-4570-9579-4e100364515a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.002 2 DEBUG nova.network.os_vif_util [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.003 2 DEBUG nova.network.os_vif_util [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f7:31:02,bridge_name='br-int',has_traffic_filtering=True,id=4df96566-2548-47bc-bd48-095ff9ce5a25,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4df96566-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.003 2 DEBUG os_vif [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:31:02,bridge_name='br-int',has_traffic_filtering=True,id=4df96566-2548-47bc-bd48-095ff9ce5a25,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4df96566-25') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.005 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4df96566-25, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:09:30 compute-0 podman[147066]: 2025-10-08 19:09:30.010810501 +0000 UTC m=+0.121088090 container cleanup fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.016 2 INFO os_vif [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:31:02,bridge_name='br-int',has_traffic_filtering=True,id=4df96566-2548-47bc-bd48-095ff9ce5a25,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4df96566-25')
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.017 2 INFO nova.virt.libvirt.driver [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Deleting instance files /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a_del
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.018 2 INFO nova.virt.libvirt.driver [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Deletion of /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a_del complete
Oct 08 19:09:30 compute-0 systemd[1]: libpod-conmon-fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07.scope: Deactivated successfully.
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.078 2 INFO nova.compute.manager [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Took 0.38 seconds to destroy the instance on the hypervisor.
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.079 2 DEBUG oslo.service.loopingcall [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.079 2 DEBUG nova.compute.manager [-] [instance: b81092db-79a9-4570-9579-4e100364515a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.079 2 DEBUG nova.network.neutron [-] [instance: b81092db-79a9-4570-9579-4e100364515a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 08 19:09:30 compute-0 podman[147119]: 2025-10-08 19:09:30.115505611 +0000 UTC m=+0.067805973 container remove fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 08 19:09:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:30.125 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b193fa4d-337e-4382-bc30-af9cf9275211]: (4, ('Wed Oct  8 07:09:29 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1 (fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07)\nfb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07\nWed Oct  8 07:09:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1 (fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07)\nfb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:30.128 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ef8c121c-1584-49f2-83d2-759b49650937]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:30.129 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap820a3a2e-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:30 compute-0 kernel: tap820a3a2e-40: left promiscuous mode
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:30.162 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b9622e3a-9f98-4735-9712-02f5b16a4c31]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:30.196 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[eb4ba426-eea9-40b7-9903-c9a9c61ef927]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:30.198 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[d5334e2b-88b3-49c1-b531-502570add548]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:30.224 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[a191e736-c1ca-463e-9a70-edb7af8cdef7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 118022, 'reachable_time': 41150, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 147132, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:30.226 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 08 19:09:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:30.227 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[2cce2644-7475-4659-b20d-3d320ba47048]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:30 compute-0 systemd[1]: run-netns-ovnmeta\x2d820a3a2e\x2d47e5\x2d4f6d\x2d88d6\x2d281476a31fb1.mount: Deactivated successfully.
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.268 2 DEBUG nova.compute.manager [req-c97f6596-a163-415a-ad01-1e59ab466d22 req-5f35c814-d439-4682-8b3c-ad3a16926199 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received event network-vif-unplugged-4df96566-2548-47bc-bd48-095ff9ce5a25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.269 2 DEBUG oslo_concurrency.lockutils [req-c97f6596-a163-415a-ad01-1e59ab466d22 req-5f35c814-d439-4682-8b3c-ad3a16926199 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b81092db-79a9-4570-9579-4e100364515a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.273 2 DEBUG oslo_concurrency.lockutils [req-c97f6596-a163-415a-ad01-1e59ab466d22 req-5f35c814-d439-4682-8b3c-ad3a16926199 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.274 2 DEBUG oslo_concurrency.lockutils [req-c97f6596-a163-415a-ad01-1e59ab466d22 req-5f35c814-d439-4682-8b3c-ad3a16926199 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.274 2 DEBUG nova.compute.manager [req-c97f6596-a163-415a-ad01-1e59ab466d22 req-5f35c814-d439-4682-8b3c-ad3a16926199 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] No waiting events found dispatching network-vif-unplugged-4df96566-2548-47bc-bd48-095ff9ce5a25 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.274 2 DEBUG nova.compute.manager [req-c97f6596-a163-415a-ad01-1e59ab466d22 req-5f35c814-d439-4682-8b3c-ad3a16926199 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received event network-vif-unplugged-4df96566-2548-47bc-bd48-095ff9ce5a25 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.787 2 DEBUG nova.network.neutron [-] [instance: b81092db-79a9-4570-9579-4e100364515a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.809 2 INFO nova.compute.manager [-] [instance: b81092db-79a9-4570-9579-4e100364515a] Took 0.73 seconds to deallocate network for instance.
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.840 2 DEBUG nova.compute.manager [req-fe788ad6-489d-4f84-9c34-70273116173c req-63be88b9-3fd5-4055-aec9-3b7bb6ebf3d9 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received event network-vif-deleted-4df96566-2548-47bc-bd48-095ff9ce5a25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.862 2 DEBUG oslo_concurrency.lockutils [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.863 2 DEBUG oslo_concurrency.lockutils [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.906 2 DEBUG nova.compute.provider_tree [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.930 2 DEBUG nova.scheduler.client.report [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.959 2 DEBUG oslo_concurrency.lockutils [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.996 2 INFO nova.scheduler.client.report [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance b81092db-79a9-4570-9579-4e100364515a
Oct 08 19:09:31 compute-0 nova_compute[117514]: 2025-10-08 19:09:31.082 2 DEBUG oslo_concurrency.lockutils [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.383s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:31 compute-0 podman[147135]: 2025-10-08 19:09:31.673008681 +0000 UTC m=+0.079252618 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 08 19:09:31 compute-0 podman[147137]: 2025-10-08 19:09:31.68221381 +0000 UTC m=+0.079161455 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:09:31 compute-0 podman[147136]: 2025-10-08 19:09:31.756192012 +0000 UTC m=+0.153976581 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct 08 19:09:32 compute-0 unix_chkpwd[147198]: password check failed for user (root)
Oct 08 19:09:32 compute-0 sshd-session[147133]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.103  user=root
Oct 08 19:09:32 compute-0 nova_compute[117514]: 2025-10-08 19:09:32.349 2 DEBUG nova.compute.manager [req-a744c4f7-4e52-43c7-9e58-dc8644ed9901 req-442b3477-e3a0-4072-9d2c-2ed9f0d101df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received event network-vif-plugged-4df96566-2548-47bc-bd48-095ff9ce5a25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:09:32 compute-0 nova_compute[117514]: 2025-10-08 19:09:32.350 2 DEBUG oslo_concurrency.lockutils [req-a744c4f7-4e52-43c7-9e58-dc8644ed9901 req-442b3477-e3a0-4072-9d2c-2ed9f0d101df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b81092db-79a9-4570-9579-4e100364515a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:32 compute-0 nova_compute[117514]: 2025-10-08 19:09:32.350 2 DEBUG oslo_concurrency.lockutils [req-a744c4f7-4e52-43c7-9e58-dc8644ed9901 req-442b3477-e3a0-4072-9d2c-2ed9f0d101df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:32 compute-0 nova_compute[117514]: 2025-10-08 19:09:32.350 2 DEBUG oslo_concurrency.lockutils [req-a744c4f7-4e52-43c7-9e58-dc8644ed9901 req-442b3477-e3a0-4072-9d2c-2ed9f0d101df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:32 compute-0 nova_compute[117514]: 2025-10-08 19:09:32.350 2 DEBUG nova.compute.manager [req-a744c4f7-4e52-43c7-9e58-dc8644ed9901 req-442b3477-e3a0-4072-9d2c-2ed9f0d101df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] No waiting events found dispatching network-vif-plugged-4df96566-2548-47bc-bd48-095ff9ce5a25 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:09:32 compute-0 nova_compute[117514]: 2025-10-08 19:09:32.350 2 WARNING nova.compute.manager [req-a744c4f7-4e52-43c7-9e58-dc8644ed9901 req-442b3477-e3a0-4072-9d2c-2ed9f0d101df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received unexpected event network-vif-plugged-4df96566-2548-47bc-bd48-095ff9ce5a25 for instance with vm_state deleted and task_state None.
Oct 08 19:09:34 compute-0 sshd-session[147133]: Failed password for root from 193.46.255.103 port 18000 ssh2
Oct 08 19:09:34 compute-0 unix_chkpwd[147199]: password check failed for user (root)
Oct 08 19:09:35 compute-0 nova_compute[117514]: 2025-10-08 19:09:35.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:35 compute-0 nova_compute[117514]: 2025-10-08 19:09:35.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:36 compute-0 nova_compute[117514]: 2025-10-08 19:09:36.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:36 compute-0 nova_compute[117514]: 2025-10-08 19:09:36.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:36 compute-0 sshd-session[147133]: Failed password for root from 193.46.255.103 port 18000 ssh2
Oct 08 19:09:37 compute-0 unix_chkpwd[147201]: password check failed for user (root)
Oct 08 19:09:39 compute-0 nova_compute[117514]: 2025-10-08 19:09:39.468 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950564.466738, 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:09:39 compute-0 nova_compute[117514]: 2025-10-08 19:09:39.469 2 INFO nova.compute.manager [-] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] VM Stopped (Lifecycle Event)
Oct 08 19:09:39 compute-0 nova_compute[117514]: 2025-10-08 19:09:39.487 2 DEBUG nova.compute.manager [None req-8fd81e35-7b7d-4fc4-9e97-119efa4e6095 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:09:39 compute-0 sshd-session[147133]: Failed password for root from 193.46.255.103 port 18000 ssh2
Oct 08 19:09:40 compute-0 nova_compute[117514]: 2025-10-08 19:09:40.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:40 compute-0 podman[147202]: 2025-10-08 19:09:40.645779945 +0000 UTC m=+0.061044233 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 08 19:09:40 compute-0 nova_compute[117514]: 2025-10-08 19:09:40.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:41 compute-0 sshd-session[147133]: Received disconnect from 193.46.255.103 port 18000:11:  [preauth]
Oct 08 19:09:41 compute-0 sshd-session[147133]: Disconnected from authenticating user root 193.46.255.103 port 18000 [preauth]
Oct 08 19:09:41 compute-0 sshd-session[147133]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.103  user=root
Oct 08 19:09:42 compute-0 unix_chkpwd[147228]: password check failed for user (root)
Oct 08 19:09:42 compute-0 sshd-session[147226]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.103  user=root
Oct 08 19:09:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:44.230 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:44.231 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:44.231 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:44 compute-0 sshd-session[147226]: Failed password for root from 193.46.255.103 port 59642 ssh2
Oct 08 19:09:44 compute-0 nova_compute[117514]: 2025-10-08 19:09:44.982 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950569.9802787, b81092db-79a9-4570-9579-4e100364515a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:09:44 compute-0 nova_compute[117514]: 2025-10-08 19:09:44.983 2 INFO nova.compute.manager [-] [instance: b81092db-79a9-4570-9579-4e100364515a] VM Stopped (Lifecycle Event)
Oct 08 19:09:45 compute-0 nova_compute[117514]: 2025-10-08 19:09:45.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:45 compute-0 nova_compute[117514]: 2025-10-08 19:09:45.038 2 DEBUG nova.compute.manager [None req-8e2b379e-2fb5-4935-9bae-f249390812b2 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:09:45 compute-0 unix_chkpwd[147229]: password check failed for user (root)
Oct 08 19:09:45 compute-0 nova_compute[117514]: 2025-10-08 19:09:45.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:47 compute-0 sshd-session[147226]: Failed password for root from 193.46.255.103 port 59642 ssh2
Oct 08 19:09:47 compute-0 unix_chkpwd[147230]: password check failed for user (root)
Oct 08 19:09:49 compute-0 nova_compute[117514]: 2025-10-08 19:09:49.913 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:49 compute-0 nova_compute[117514]: 2025-10-08 19:09:49.914 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:49 compute-0 nova_compute[117514]: 2025-10-08 19:09:49.929 2 DEBUG nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 08 19:09:49 compute-0 sshd-session[147226]: Failed password for root from 193.46.255.103 port 59642 ssh2
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.008 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.009 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.020 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.021 2 INFO nova.compute.claims [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Claim successful on node compute-0.ctlplane.example.com
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.122 2 DEBUG nova.compute.provider_tree [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.138 2 DEBUG nova.scheduler.client.report [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.160 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.161 2 DEBUG nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.212 2 DEBUG nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.213 2 DEBUG nova.network.neutron [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.229 2 INFO nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.243 2 DEBUG nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 08 19:09:50 compute-0 sshd-session[147226]: Received disconnect from 193.46.255.103 port 59642:11:  [preauth]
Oct 08 19:09:50 compute-0 sshd-session[147226]: Disconnected from authenticating user root 193.46.255.103 port 59642 [preauth]
Oct 08 19:09:50 compute-0 sshd-session[147226]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.103  user=root
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.329 2 DEBUG nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.331 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.332 2 INFO nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Creating image(s)
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.333 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.333 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.335 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.360 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.447 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.448 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.449 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.466 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.542 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.543 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.640 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk 1073741824" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.642 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.643 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:09:50 compute-0 podman[147240]: 2025-10-08 19:09:50.677693593 +0000 UTC m=+0.104230197 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.732 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.733 2 DEBUG nova.virt.disk.api [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.733 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.786 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.788 2 DEBUG nova.virt.disk.api [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.788 2 DEBUG nova.objects.instance [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 783f8889-2bc8-4641-bdb9-95ee4226a2fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.806 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.806 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Ensure instance console log exists: /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.807 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.808 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.808 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:51 compute-0 nova_compute[117514]: 2025-10-08 19:09:51.117 2 DEBUG nova.policy [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 08 19:09:51 compute-0 unix_chkpwd[147268]: password check failed for user (root)
Oct 08 19:09:51 compute-0 sshd-session[147232]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.103  user=root
Oct 08 19:09:52 compute-0 sshd-session[147232]: Failed password for root from 193.46.255.103 port 54442 ssh2
Oct 08 19:09:53 compute-0 nova_compute[117514]: 2025-10-08 19:09:53.187 2 DEBUG nova.network.neutron [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Successfully created port: bfb32e9e-52b6-4043-b9a6-129d11fa2814 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 08 19:09:53 compute-0 unix_chkpwd[147269]: password check failed for user (root)
Oct 08 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.096 2 DEBUG nova.network.neutron [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Successfully updated port: bfb32e9e-52b6-4043-b9a6-129d11fa2814 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 08 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.126 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.126 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.126 2 DEBUG nova.network.neutron [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 08 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.211 2 DEBUG nova.compute.manager [req-dea58d1a-1d5c-4e74-962e-f87c0ba7c9fc req-449fc039-0678-4d47-b3f3-84001da0719d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-changed-bfb32e9e-52b6-4043-b9a6-129d11fa2814 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.212 2 DEBUG nova.compute.manager [req-dea58d1a-1d5c-4e74-962e-f87c0ba7c9fc req-449fc039-0678-4d47-b3f3-84001da0719d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing instance network info cache due to event network-changed-bfb32e9e-52b6-4043-b9a6-129d11fa2814. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.212 2 DEBUG oslo_concurrency.lockutils [req-dea58d1a-1d5c-4e74-962e-f87c0ba7c9fc req-449fc039-0678-4d47-b3f3-84001da0719d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.274 2 DEBUG nova.network.neutron [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 08 19:09:55 compute-0 sshd-session[147232]: Failed password for root from 193.46.255.103 port 54442 ssh2
Oct 08 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:56 compute-0 unix_chkpwd[147270]: password check failed for user (root)
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.387 2 DEBUG nova.network.neutron [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.408 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.409 2 DEBUG nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Instance network_info: |[{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.410 2 DEBUG oslo_concurrency.lockutils [req-dea58d1a-1d5c-4e74-962e-f87c0ba7c9fc req-449fc039-0678-4d47-b3f3-84001da0719d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.410 2 DEBUG nova.network.neutron [req-dea58d1a-1d5c-4e74-962e-f87c0ba7c9fc req-449fc039-0678-4d47-b3f3-84001da0719d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing network info cache for port bfb32e9e-52b6-4043-b9a6-129d11fa2814 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.416 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Start _get_guest_xml network_info=[{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.425 2 WARNING nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.437 2 DEBUG nova.virt.libvirt.host [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.438 2 DEBUG nova.virt.libvirt.host [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.442 2 DEBUG nova.virt.libvirt.host [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.443 2 DEBUG nova.virt.libvirt.host [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.443 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.444 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.445 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.445 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.446 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.446 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.446 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.447 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.447 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.448 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.448 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.449 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.455 2 DEBUG nova.virt.libvirt.vif [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1641480242',display_name='tempest-TestNetworkBasicOps-server-1641480242',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1641480242',id=6,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPSyEE+QeB2DOtd7xoaY+J9mVl+DzPE43UDhso7eEGO9aQXs3wmj/YcqHfJ97lRUVFOa3dbwNiIUyunSI3DyzjQf/v6cjCZ2KkxRD0GJnQ0zRM5omnXaZRnz3Bq5VONa9g==',key_name='tempest-TestNetworkBasicOps-1535027603',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-pbt1zket',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:09:50Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=783f8889-2bc8-4641-bdb9-95ee4226a2fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.456 2 DEBUG nova.network.os_vif_util [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.457 2 DEBUG nova.network.os_vif_util [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:85:2e,bridge_name='br-int',has_traffic_filtering=True,id=bfb32e9e-52b6-4043-b9a6-129d11fa2814,network=Network(0d073e98-c9f2-4b90-8237-84ff2fa99090),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb32e9e-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.459 2 DEBUG nova.objects.instance [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 783f8889-2bc8-4641-bdb9-95ee4226a2fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.478 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] End _get_guest_xml xml=<domain type="kvm">
Oct 08 19:09:56 compute-0 nova_compute[117514]:   <uuid>783f8889-2bc8-4641-bdb9-95ee4226a2fd</uuid>
Oct 08 19:09:56 compute-0 nova_compute[117514]:   <name>instance-00000006</name>
Oct 08 19:09:56 compute-0 nova_compute[117514]:   <memory>131072</memory>
Oct 08 19:09:56 compute-0 nova_compute[117514]:   <vcpu>1</vcpu>
Oct 08 19:09:56 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <nova:name>tempest-TestNetworkBasicOps-server-1641480242</nova:name>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <nova:creationTime>2025-10-08 19:09:56</nova:creationTime>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <nova:flavor name="m1.nano">
Oct 08 19:09:56 compute-0 nova_compute[117514]:         <nova:memory>128</nova:memory>
Oct 08 19:09:56 compute-0 nova_compute[117514]:         <nova:disk>1</nova:disk>
Oct 08 19:09:56 compute-0 nova_compute[117514]:         <nova:swap>0</nova:swap>
Oct 08 19:09:56 compute-0 nova_compute[117514]:         <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:09:56 compute-0 nova_compute[117514]:         <nova:vcpus>1</nova:vcpus>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       </nova:flavor>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <nova:owner>
Oct 08 19:09:56 compute-0 nova_compute[117514]:         <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:09:56 compute-0 nova_compute[117514]:         <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       </nova:owner>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <nova:ports>
Oct 08 19:09:56 compute-0 nova_compute[117514]:         <nova:port uuid="bfb32e9e-52b6-4043-b9a6-129d11fa2814">
Oct 08 19:09:56 compute-0 nova_compute[117514]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:         </nova:port>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       </nova:ports>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     </nova:instance>
Oct 08 19:09:56 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:09:56 compute-0 nova_compute[117514]:   <sysinfo type="smbios">
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <system>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <entry name="manufacturer">RDO</entry>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <entry name="product">OpenStack Compute</entry>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <entry name="serial">783f8889-2bc8-4641-bdb9-95ee4226a2fd</entry>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <entry name="uuid">783f8889-2bc8-4641-bdb9-95ee4226a2fd</entry>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <entry name="family">Virtual Machine</entry>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     </system>
Oct 08 19:09:56 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:09:56 compute-0 nova_compute[117514]:   <os>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <boot dev="hd"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <smbios mode="sysinfo"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:   </os>
Oct 08 19:09:56 compute-0 nova_compute[117514]:   <features>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <vmcoreinfo/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:   </features>
Oct 08 19:09:56 compute-0 nova_compute[117514]:   <clock offset="utc">
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <timer name="pit" tickpolicy="delay"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <timer name="hpet" present="no"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:09:56 compute-0 nova_compute[117514]:   <cpu mode="host-model" match="exact">
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <topology sockets="1" cores="1" threads="1"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:09:56 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <disk type="file" device="disk">
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <target dev="vda" bus="virtio"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <disk type="file" device="cdrom">
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <driver name="qemu" type="raw" cache="none"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.config"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <target dev="sda" bus="sata"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <interface type="ethernet">
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <mac address="fa:16:3e:4e:85:2e"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <mtu size="1442"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <target dev="tapbfb32e9e-52"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <serial type="pty">
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <log file="/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/console.log" append="off"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <video>
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     </video>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <input type="tablet" bus="usb"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <rng model="virtio">
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <backend model="random">/dev/urandom</backend>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <controller type="usb" index="0"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     <memballoon model="virtio">
Oct 08 19:09:56 compute-0 nova_compute[117514]:       <stats period="10"/>
Oct 08 19:09:56 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:09:56 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:09:56 compute-0 nova_compute[117514]: </domain>
Oct 08 19:09:56 compute-0 nova_compute[117514]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.479 2 DEBUG nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Preparing to wait for external event network-vif-plugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.480 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.480 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.481 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.481 2 DEBUG nova.virt.libvirt.vif [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1641480242',display_name='tempest-TestNetworkBasicOps-server-1641480242',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1641480242',id=6,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPSyEE+QeB2DOtd7xoaY+J9mVl+DzPE43UDhso7eEGO9aQXs3wmj/YcqHfJ97lRUVFOa3dbwNiIUyunSI3DyzjQf/v6cjCZ2KkxRD0GJnQ0zRM5omnXaZRnz3Bq5VONa9g==',key_name='tempest-TestNetworkBasicOps-1535027603',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-pbt1zket',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:09:50Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=783f8889-2bc8-4641-bdb9-95ee4226a2fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.481 2 DEBUG nova.network.os_vif_util [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.482 2 DEBUG nova.network.os_vif_util [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:85:2e,bridge_name='br-int',has_traffic_filtering=True,id=bfb32e9e-52b6-4043-b9a6-129d11fa2814,network=Network(0d073e98-c9f2-4b90-8237-84ff2fa99090),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb32e9e-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.483 2 DEBUG os_vif [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:85:2e,bridge_name='br-int',has_traffic_filtering=True,id=bfb32e9e-52b6-4043-b9a6-129d11fa2814,network=Network(0d073e98-c9f2-4b90-8237-84ff2fa99090),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb32e9e-52') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.483 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.484 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.489 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbfb32e9e-52, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.489 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbfb32e9e-52, col_values=(('external_ids', {'iface-id': 'bfb32e9e-52b6-4043-b9a6-129d11fa2814', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:85:2e', 'vm-uuid': '783f8889-2bc8-4641-bdb9-95ee4226a2fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:56 compute-0 NetworkManager[1035]: <info>  [1759950596.4930] manager: (tapbfb32e9e-52): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.505 2 INFO os_vif [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:85:2e,bridge_name='br-int',has_traffic_filtering=True,id=bfb32e9e-52b6-4043-b9a6-129d11fa2814,network=Network(0d073e98-c9f2-4b90-8237-84ff2fa99090),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb32e9e-52')
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.566 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.568 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.568 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:4e:85:2e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 08 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.568 2 INFO nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Using config drive
Oct 08 19:09:56 compute-0 podman[147275]: 2025-10-08 19:09:56.611698285 +0000 UTC m=+0.066953108 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 08 19:09:56 compute-0 podman[147274]: 2025-10-08 19:09:56.616710371 +0000 UTC m=+0.077639480 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 08 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.158 2 INFO nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Creating config drive at /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.config
Oct 08 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.164 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb5eau1jo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.303 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb5eau1jo" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:09:57 compute-0 kernel: tapbfb32e9e-52: entered promiscuous mode
Oct 08 19:09:57 compute-0 NetworkManager[1035]: <info>  [1759950597.3985] manager: (tapbfb32e9e-52): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Oct 08 19:09:57 compute-0 ovn_controller[19759]: 2025-10-08T19:09:57Z|00082|binding|INFO|Claiming lport bfb32e9e-52b6-4043-b9a6-129d11fa2814 for this chassis.
Oct 08 19:09:57 compute-0 ovn_controller[19759]: 2025-10-08T19:09:57Z|00083|binding|INFO|bfb32e9e-52b6-4043-b9a6-129d11fa2814: Claiming fa:16:3e:4e:85:2e 10.100.0.14
Oct 08 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.425 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:85:2e 10.100.0.14'], port_security=['fa:16:3e:4e:85:2e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0d073e98-c9f2-4b90-8237-84ff2fa99090', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e1f96720-345d-4fd7-8b5f-d68f6fe81454', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd3b59ed-5967-491c-a3b5-d0ba2b165b15, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=bfb32e9e-52b6-4043-b9a6-129d11fa2814) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.427 28643 INFO neutron.agent.ovn.metadata.agent [-] Port bfb32e9e-52b6-4043-b9a6-129d11fa2814 in datapath 0d073e98-c9f2-4b90-8237-84ff2fa99090 bound to our chassis
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.428 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0d073e98-c9f2-4b90-8237-84ff2fa99090
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.443 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[6cf3e7a6-0947-4d9a-9bfc-ed3f89d6cce8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.444 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0d073e98-c1 in ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.447 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0d073e98-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.447 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[fff2d7f4-3567-4889-8cbb-5ed7d6631111]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.448 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[43498475-7187-4f00-bc30-e6dfa1a7e492]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:57 compute-0 systemd-machined[77568]: New machine qemu-6-instance-00000006.
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.463 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[d1b90d76-492a-403b-84a1-427cbd5d293f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:57 compute-0 ovn_controller[19759]: 2025-10-08T19:09:57Z|00084|binding|INFO|Setting lport bfb32e9e-52b6-4043-b9a6-129d11fa2814 ovn-installed in OVS
Oct 08 19:09:57 compute-0 ovn_controller[19759]: 2025-10-08T19:09:57Z|00085|binding|INFO|Setting lport bfb32e9e-52b6-4043-b9a6-129d11fa2814 up in Southbound
Oct 08 19:09:57 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000006.
Oct 08 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.493 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[369134e0-e2aa-4a5a-bc05-d553d086ef15]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:57 compute-0 systemd-udevd[147333]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:09:57 compute-0 NetworkManager[1035]: <info>  [1759950597.5129] device (tapbfb32e9e-52): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 08 19:09:57 compute-0 NetworkManager[1035]: <info>  [1759950597.5138] device (tapbfb32e9e-52): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.536 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[b2125bbd-4b27-4bfb-b1f8-9416ebb6a7a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:57 compute-0 systemd-udevd[147338]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.542 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[3ce90c26-a586-4dd6-b703-bdc77f042301]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:57 compute-0 NetworkManager[1035]: <info>  [1759950597.5440] manager: (tap0d073e98-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/53)
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.589 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[a8dc1c17-3027-47cd-a63e-86046e3d4455]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.595 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[28764abd-c22a-4f8c-b4f2-f87c1ce6a0e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:57 compute-0 NetworkManager[1035]: <info>  [1759950597.6280] device (tap0d073e98-c0): carrier: link connected
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.632 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[b3fbb484-5d1f-4196-b6df-bd35824ca717]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.656 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[c96e4d6c-cb1d-453a-8b51-4f25d2909a73]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0d073e98-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:56:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 126820, 'reachable_time': 32249, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 147363, 'error': None, 'target': 'ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.680 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[fb644dc3-1fd1-4770-8ebe-8d135148bd1d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:5643'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 126820, 'tstamp': 126820}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 147364, 'error': None, 'target': 'ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.697 2 DEBUG nova.network.neutron [req-dea58d1a-1d5c-4e74-962e-f87c0ba7c9fc req-449fc039-0678-4d47-b3f3-84001da0719d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updated VIF entry in instance network info cache for port bfb32e9e-52b6-4043-b9a6-129d11fa2814. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.698 2 DEBUG nova.network.neutron [req-dea58d1a-1d5c-4e74-962e-f87c0ba7c9fc req-449fc039-0678-4d47-b3f3-84001da0719d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.703 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[c401d904-f04b-44fa-b6ea-56f4e69362f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0d073e98-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:56:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 126820, 'reachable_time': 32249, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 147365, 'error': None, 'target': 'ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.714 2 DEBUG oslo_concurrency.lockutils [req-dea58d1a-1d5c-4e74-962e-f87c0ba7c9fc req-449fc039-0678-4d47-b3f3-84001da0719d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.718 2 DEBUG nova.compute.manager [req-65dcc135-4bee-4397-9880-e66e2f48a2ed req-5e19914d-c671-48e9-bd39-96df0317b38f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-plugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.719 2 DEBUG oslo_concurrency.lockutils [req-65dcc135-4bee-4397-9880-e66e2f48a2ed req-5e19914d-c671-48e9-bd39-96df0317b38f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.720 2 DEBUG oslo_concurrency.lockutils [req-65dcc135-4bee-4397-9880-e66e2f48a2ed req-5e19914d-c671-48e9-bd39-96df0317b38f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.720 2 DEBUG oslo_concurrency.lockutils [req-65dcc135-4bee-4397-9880-e66e2f48a2ed req-5e19914d-c671-48e9-bd39-96df0317b38f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.721 2 DEBUG nova.compute.manager [req-65dcc135-4bee-4397-9880-e66e2f48a2ed req-5e19914d-c671-48e9-bd39-96df0317b38f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Processing event network-vif-plugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.747 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[20e0fa64-f842-4e85-8e41-b607b34e325a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.825 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[cb468035-7751-4b32-8102-53e4f19c23d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.827 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d073e98-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.827 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.828 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0d073e98-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:57 compute-0 kernel: tap0d073e98-c0: entered promiscuous mode
Oct 08 19:09:57 compute-0 NetworkManager[1035]: <info>  [1759950597.8312] manager: (tap0d073e98-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Oct 08 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.835 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0d073e98-c0, col_values=(('external_ids', {'iface-id': 'ef1b5170-2d11-4e01-98e4-310f59c22ecd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:09:57 compute-0 ovn_controller[19759]: 2025-10-08T19:09:57Z|00086|binding|INFO|Releasing lport ef1b5170-2d11-4e01-98e4-310f59c22ecd from this chassis (sb_readonly=0)
Oct 08 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.838 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0d073e98-c9f2-4b90-8237-84ff2fa99090.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0d073e98-c9f2-4b90-8237-84ff2fa99090.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.839 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[3aae28d3-ff3a-49e5-ab1b-2b15925d6ad8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.840 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: global
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     log         /dev/log local0 debug
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     log-tag     haproxy-metadata-proxy-0d073e98-c9f2-4b90-8237-84ff2fa99090
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     user        root
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     group       root
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     maxconn     1024
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     pidfile     /var/lib/neutron/external/pids/0d073e98-c9f2-4b90-8237-84ff2fa99090.pid.haproxy
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     daemon
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: defaults
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     log global
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     mode http
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     option httplog
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     option dontlognull
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     option http-server-close
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     option forwardfor
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     retries                 3
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     timeout http-request    30s
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     timeout connect         30s
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     timeout client          32s
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     timeout server          32s
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     timeout http-keep-alive 30s
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: listen listener
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     bind 169.254.169.254:80
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     server metadata /var/lib/neutron/metadata_proxy
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:     http-request add-header X-OVN-Network-ID 0d073e98-c9f2-4b90-8237-84ff2fa99090
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 08 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.841 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090', 'env', 'PROCESS_TAG=haproxy-0d073e98-c9f2-4b90-8237-84ff2fa99090', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0d073e98-c9f2-4b90-8237-84ff2fa99090.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 08 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:09:58 compute-0 sshd-session[147232]: Failed password for root from 193.46.255.103 port 54442 ssh2
Oct 08 19:09:58 compute-0 podman[147404]: 2025-10-08 19:09:58.308804596 +0000 UTC m=+0.060509610 container create a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct 08 19:09:58 compute-0 systemd[1]: Started libpod-conmon-a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc.scope.
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.344 2 DEBUG nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 08 19:09:58 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.347 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.348 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950598.3480842, 783f8889-2bc8-4641-bdb9-95ee4226a2fd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.348 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] VM Started (Lifecycle Event)
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.352 2 INFO nova.virt.libvirt.driver [-] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Instance spawned successfully.
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.353 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 08 19:09:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a1feba6530d5a46f34a3cb37ffae2c111c4760047000322a985a4db99d10005/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 08 19:09:58 compute-0 podman[147404]: 2025-10-08 19:09:58.273431132 +0000 UTC m=+0.025136196 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 08 19:09:58 compute-0 podman[147404]: 2025-10-08 19:09:58.370369475 +0000 UTC m=+0.122074499 container init a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 08 19:09:58 compute-0 podman[147404]: 2025-10-08 19:09:58.376654469 +0000 UTC m=+0.128359473 container start a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.389 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:09:58 compute-0 neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090[147419]: [NOTICE]   (147423) : New worker (147425) forked
Oct 08 19:09:58 compute-0 neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090[147419]: [NOTICE]   (147423) : Loading success.
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.398 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.401 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.402 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.405 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.405 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.406 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.406 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.433 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.434 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950598.348189, 783f8889-2bc8-4641-bdb9-95ee4226a2fd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.435 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] VM Paused (Lifecycle Event)
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.472 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.475 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950598.3487449, 783f8889-2bc8-4641-bdb9-95ee4226a2fd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.476 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] VM Resumed (Lifecycle Event)
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.484 2 INFO nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Took 8.15 seconds to spawn the instance on the hypervisor.
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.485 2 DEBUG nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.496 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.499 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.527 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.537 2 INFO nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Took 8.57 seconds to build instance.
Oct 08 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.553 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:58 compute-0 sshd-session[147232]: Received disconnect from 193.46.255.103 port 54442:11:  [preauth]
Oct 08 19:09:58 compute-0 sshd-session[147232]: Disconnected from authenticating user root 193.46.255.103 port 54442 [preauth]
Oct 08 19:09:58 compute-0 sshd-session[147232]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.103  user=root
Oct 08 19:09:59 compute-0 nova_compute[117514]: 2025-10-08 19:09:59.794 2 DEBUG nova.compute.manager [req-8d271516-cd18-4a8b-9d12-f76234311223 req-2b10a661-3433-43cd-8f25-b8e17a5096fc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-plugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:09:59 compute-0 nova_compute[117514]: 2025-10-08 19:09:59.795 2 DEBUG oslo_concurrency.lockutils [req-8d271516-cd18-4a8b-9d12-f76234311223 req-2b10a661-3433-43cd-8f25-b8e17a5096fc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:09:59 compute-0 nova_compute[117514]: 2025-10-08 19:09:59.796 2 DEBUG oslo_concurrency.lockutils [req-8d271516-cd18-4a8b-9d12-f76234311223 req-2b10a661-3433-43cd-8f25-b8e17a5096fc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:09:59 compute-0 nova_compute[117514]: 2025-10-08 19:09:59.796 2 DEBUG oslo_concurrency.lockutils [req-8d271516-cd18-4a8b-9d12-f76234311223 req-2b10a661-3433-43cd-8f25-b8e17a5096fc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:09:59 compute-0 nova_compute[117514]: 2025-10-08 19:09:59.796 2 DEBUG nova.compute.manager [req-8d271516-cd18-4a8b-9d12-f76234311223 req-2b10a661-3433-43cd-8f25-b8e17a5096fc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] No waiting events found dispatching network-vif-plugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:09:59 compute-0 nova_compute[117514]: 2025-10-08 19:09:59.797 2 WARNING nova.compute.manager [req-8d271516-cd18-4a8b-9d12-f76234311223 req-2b10a661-3433-43cd-8f25-b8e17a5096fc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received unexpected event network-vif-plugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 for instance with vm_state active and task_state None.
Oct 08 19:10:00 compute-0 ovn_controller[19759]: 2025-10-08T19:10:00Z|00087|binding|INFO|Releasing lport ef1b5170-2d11-4e01-98e4-310f59c22ecd from this chassis (sb_readonly=0)
Oct 08 19:10:00 compute-0 nova_compute[117514]: 2025-10-08 19:10:00.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:00 compute-0 NetworkManager[1035]: <info>  [1759950600.5431] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Oct 08 19:10:00 compute-0 NetworkManager[1035]: <info>  [1759950600.5448] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Oct 08 19:10:00 compute-0 nova_compute[117514]: 2025-10-08 19:10:00.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:00 compute-0 ovn_controller[19759]: 2025-10-08T19:10:00Z|00088|binding|INFO|Releasing lport ef1b5170-2d11-4e01-98e4-310f59c22ecd from this chassis (sb_readonly=0)
Oct 08 19:10:00 compute-0 nova_compute[117514]: 2025-10-08 19:10:00.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:00 compute-0 systemd[1]: Starting system activity accounting tool...
Oct 08 19:10:00 compute-0 systemd[1]: sysstat-collect.service: Deactivated successfully.
Oct 08 19:10:00 compute-0 systemd[1]: Finished system activity accounting tool.
Oct 08 19:10:00 compute-0 podman[147434]: 2025-10-08 19:10:00.72967979 +0000 UTC m=+0.103418974 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 08 19:10:00 compute-0 nova_compute[117514]: 2025-10-08 19:10:00.853 2 DEBUG nova.compute.manager [req-8fa52e10-6370-4413-a2c8-fa754f0e87fd req-b2690757-1189-4b87-a6f6-9a61de74ec3a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-changed-bfb32e9e-52b6-4043-b9a6-129d11fa2814 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:10:00 compute-0 nova_compute[117514]: 2025-10-08 19:10:00.853 2 DEBUG nova.compute.manager [req-8fa52e10-6370-4413-a2c8-fa754f0e87fd req-b2690757-1189-4b87-a6f6-9a61de74ec3a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing instance network info cache due to event network-changed-bfb32e9e-52b6-4043-b9a6-129d11fa2814. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:10:00 compute-0 nova_compute[117514]: 2025-10-08 19:10:00.854 2 DEBUG oslo_concurrency.lockutils [req-8fa52e10-6370-4413-a2c8-fa754f0e87fd req-b2690757-1189-4b87-a6f6-9a61de74ec3a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:10:00 compute-0 nova_compute[117514]: 2025-10-08 19:10:00.854 2 DEBUG oslo_concurrency.lockutils [req-8fa52e10-6370-4413-a2c8-fa754f0e87fd req-b2690757-1189-4b87-a6f6-9a61de74ec3a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:10:00 compute-0 nova_compute[117514]: 2025-10-08 19:10:00.854 2 DEBUG nova.network.neutron [req-8fa52e10-6370-4413-a2c8-fa754f0e87fd req-b2690757-1189-4b87-a6f6-9a61de74ec3a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing network info cache for port bfb32e9e-52b6-4043-b9a6-129d11fa2814 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:10:00 compute-0 nova_compute[117514]: 2025-10-08 19:10:00.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:01 compute-0 nova_compute[117514]: 2025-10-08 19:10:01.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:01 compute-0 nova_compute[117514]: 2025-10-08 19:10:01.823 2 DEBUG nova.network.neutron [req-8fa52e10-6370-4413-a2c8-fa754f0e87fd req-b2690757-1189-4b87-a6f6-9a61de74ec3a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updated VIF entry in instance network info cache for port bfb32e9e-52b6-4043-b9a6-129d11fa2814. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:10:01 compute-0 nova_compute[117514]: 2025-10-08 19:10:01.824 2 DEBUG nova.network.neutron [req-8fa52e10-6370-4413-a2c8-fa754f0e87fd req-b2690757-1189-4b87-a6f6-9a61de74ec3a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:10:01 compute-0 nova_compute[117514]: 2025-10-08 19:10:01.844 2 DEBUG oslo_concurrency.lockutils [req-8fa52e10-6370-4413-a2c8-fa754f0e87fd req-b2690757-1189-4b87-a6f6-9a61de74ec3a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:10:02 compute-0 podman[147465]: 2025-10-08 19:10:02.670523365 +0000 UTC m=+0.075830518 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 08 19:10:02 compute-0 podman[147463]: 2025-10-08 19:10:02.693827056 +0000 UTC m=+0.103180007 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 08 19:10:02 compute-0 podman[147464]: 2025-10-08 19:10:02.721777913 +0000 UTC m=+0.134537743 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 08 19:10:03 compute-0 nova_compute[117514]: 2025-10-08 19:10:03.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:10:05 compute-0 nova_compute[117514]: 2025-10-08 19:10:05.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:10:05 compute-0 nova_compute[117514]: 2025-10-08 19:10:05.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.747 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.751 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.751 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.752 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.843 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.942 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.944 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.013 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.211 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.213 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5939MB free_disk=73.41488647460938GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.214 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.214 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.324 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Instance 783f8889-2bc8-4641-bdb9-95ee4226a2fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 08 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.325 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.325 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.371 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.386 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.410 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.410 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.245 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'name': 'tempest-TestNetworkBasicOps-server-1641480242', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'hostId': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.246 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.246 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.246 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1641480242>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1641480242>]
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.247 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.247 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.247 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1641480242>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1641480242>]
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.247 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.250 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 783f8889-2bc8-4641-bdb9-95ee4226a2fd / tapbfb32e9e-52 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.251 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59ac8332-5284-4251-972f-1fcbc96b74e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.247946', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67c690da-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': '7e8d8a7d5331579e981382ddf35932b5ffe915addaad7ebe17b8049e1e1a3504'}]}, 'timestamp': '2025-10-08 19:10:08.252064', '_unique_id': 'acfb74cbd15a4b269c09aa037c2fed37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.254 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.278 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.279 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2077fb84-44d7-40d3-9132-df4fa0810708', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-vda', 'timestamp': '2025-10-08T19:10:08.255145', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67cac5ce-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': '1968980c403dd572fd25753bd6a4a54710bfaabb54944dc62fbababfce694128'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-sda', 'timestamp': '2025-10-08T19:10:08.255145', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67caeafe-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': 'af5f05d7cd411568017f5d9e07bb47a4ccafb57cb3f11950e30501858689c0a0'}]}, 'timestamp': '2025-10-08 19:10:08.280483', '_unique_id': 'ef279cfca85c4e05aa790d0d45332b43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.284 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.284 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.284 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1641480242>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1641480242>]
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.284 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.299 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.300 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f76c475f-9210-4a97-b3cc-7c09e8cc309f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-vda', 'timestamp': '2025-10-08T19:10:08.284925', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67cdf802-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.914143417, 'message_signature': '8634c394ff11065505affffc95f8f57e2feca32876ed9fbd88db340811b8e052'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-sda', 'timestamp': '2025-10-08T19:10:08.284925', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67ce0a40-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.914143417, 'message_signature': '73cea4d89a34139c517c37e24856f6f13c2024484abd57b532b474d3fd220810'}]}, 'timestamp': '2025-10-08 19:10:08.300956', '_unique_id': '6cf0648dcbfe449eac7c09609442e3e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.303 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.303 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.read.bytes volume: 23816192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.304 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '262a7157-0ac5-4335-b6ca-df479a13f509', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23816192, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-vda', 'timestamp': '2025-10-08T19:10:08.303521', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67ce8236-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': '9d814652e1a5ecc1926e7f8771a18bdaa8c8d1f55f7584d8d0b929874d16cd5f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-sda', 'timestamp': '2025-10-08T19:10:08.303521', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67ce9596-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': 'bb9d5f669b9194c9ff30e9df6f5880fc647a2561154ac907d67685a70f600a9f'}]}, 'timestamp': '2025-10-08 19:10:08.304479', '_unique_id': 'da72b37b5a5048d3ae68957b8f4c8412'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.306 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.307 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.307 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1641480242>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1641480242>]
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.307 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.333 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/memory.usage volume: 40.3671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f4b875f-fc7f-4e1c-8ea4-aeaff479318e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.3671875, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'timestamp': '2025-10-08T19:10:08.307579', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '67d32a52-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.962898662, 'message_signature': '8b6f7c50256cdb929db0c75abf109a1c42c75f1cd8a913bbc53d1fcc9f9b6165'}]}, 'timestamp': '2025-10-08 19:10:08.334576', '_unique_id': 'b57501b5c1b64164ac9edbefab90a5b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.337 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.337 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efa12b42-d049-43d7-90cc-f64326835aaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.337655', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67d3bb70-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': 'ec5bd9fdc80d350cda331c190485666f6b02060aac7ec23f51a7bb83e3d1555b'}]}, 'timestamp': '2025-10-08 19:10:08.338259', '_unique_id': 'bed1bacc2cef4037bd2776cc26fda8cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.340 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.341 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.341 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15b22c9d-f89d-47e6-95f3-5c062d87b84d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-vda', 'timestamp': '2025-10-08T19:10:08.341098', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67d44018-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.914143417, 'message_signature': 'fd25c88e92ce428a01db97bda9a2800260a9dd16c7d883802cea258d844ff4a3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-sda', 'timestamp': '2025-10-08T19:10:08.341098', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67d45468-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.914143417, 'message_signature': '766c62586d92252b91ba97818e10eb0520efea3478c669bacb79c63c3b134886'}]}, 'timestamp': '2025-10-08 19:10:08.342170', '_unique_id': '564e3dc3a9ea46f3a8ed172ba2dd2e58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.344 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.344 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c38427d-d1d6-4c54-aa4d-fd90439caf1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.344701', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67d4cede-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': '82aa521a4946745bbaf56e180e7b888555512fd5b78a29e3f1e2697a7ae78a79'}]}, 'timestamp': '2025-10-08 19:10:08.345305', '_unique_id': '10eddd7060ed4f4b891863c4de764885'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.347 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.347 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.348 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3ec50a4-bb4f-473c-a412-56ab305e7361', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-vda', 'timestamp': '2025-10-08T19:10:08.347764', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67d544e0-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': 'be05585fe3909a7913092dbfea69b33f2a69f354c9e11cdf106beccbd99a5943'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-sda', 'timestamp': '2025-10-08T19:10:08.347764', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67d55732-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': '6d6da5ed6ad2c95cc0637593f364dacd30d873f25ac222632f1e08deba2acad7'}]}, 'timestamp': '2025-10-08 19:10:08.348760', '_unique_id': 'e9ab4fcc955446a89a71365cbea8bc17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.351 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.351 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd2eca84-88be-4e8b-b908-5a1dd10fe9bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.351323', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67d5cdf2-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': 'ae43ec7baf3a21b0a38bc95755108c5b670b5434a182e8285f108b602d8df2e1'}]}, 'timestamp': '2025-10-08 19:10:08.351828', '_unique_id': 'f8092e3b04864e098c01ae36f233d602'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.354 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.354 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6104bb4-175a-493d-b877-0618a5874d7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.354269', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67d6417e-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': '9d0b519d2bec54f8e9b9698aa6d6a4ab7096d422fb9c4ce5f95f5360038b5d6d'}]}, 'timestamp': '2025-10-08 19:10:08.354817', '_unique_id': 'ee5ee5ec92e744218a955a4e829324ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.357 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.357 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f131d7ba-bb5f-4657-a0f2-1c758e9cd286', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.357545', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67d6c5f4-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': '02c467156e728730d7b36e857a556370ea63ac6343e654775b7c4c2b2cbdb5b3'}]}, 'timestamp': '2025-10-08 19:10:08.358203', '_unique_id': '056debf2c3aa4fd0956f9e32388c8254'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.362 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.363 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.read.latency volume: 471630043 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.363 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.read.latency volume: 2636697 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cda39725-6986-4cf1-a693-b1ee04724d13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 471630043, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-vda', 'timestamp': '2025-10-08T19:10:08.363268', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67d79f1a-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': '3657ffc056deceba6ac420954edb7e16d797e858954b78f0508f17658e131d95'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2636697, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-sda', 'timestamp': '2025-10-08T19:10:08.363268', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67d7aa64-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': '7087526c63025fd99427088ce14526570ee6417316c7fa2f3e5ed6979058e4f6'}]}, 'timestamp': '2025-10-08 19:10:08.363933', '_unique_id': '914f21d9da714ec185d93af47ac6731c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.365 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.365 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c01d58e-8a10-4309-9853-13f19b96fb19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.365588', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67d7f5d2-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': '6293799f771740a27f35e335f34030eeea97972b354c751b600df438b1516045'}]}, 'timestamp': '2025-10-08 19:10:08.365830', '_unique_id': 'afa7813c34e44c07af560998d96bf7e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e97ff52e-8119-4e95-8660-4df76dc6c25a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-vda', 'timestamp': '2025-10-08T19:10:08.366960', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67d82b2e-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.914143417, 'message_signature': '1a9f66124ae09afc20ab885703c77702192820b0d46b6bbfacef391e780fd64c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-sda', 'timestamp': '2025-10-08T19:10:08.366960', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67d83344-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.914143417, 'message_signature': '7223313bf0682c50ba3d12bcdd5f3981814cc070b33fb62f2660d04ac7495591'}]}, 'timestamp': '2025-10-08 19:10:08.367382', '_unique_id': '4ec8a17cc55643648f19566188eb6caa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.368 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.368 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/cpu volume: 9220000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7f6d6c6-0ec0-48e6-9bce-1f9c88a9572a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9220000000, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'timestamp': '2025-10-08T19:10:08.368457', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '67d86580-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.962898662, 'message_signature': '037f550e0e47e4ae8a27015d8886bdb8fbfb2ed3a37266a0882b763017060f34'}]}, 'timestamp': '2025-10-08 19:10:08.368682', '_unique_id': 'fe307d5da9b343c69ec556e5ba953d7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.read.requests volume: 770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d603454-6cf8-49d1-8512-6c59e249833f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 770, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-vda', 'timestamp': '2025-10-08T19:10:08.369723', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67d896c2-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': 'be2ab941aaeed56d98b2451da57746c09fb9f552b5013cfbf1a468980a903193'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 
'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-sda', 'timestamp': '2025-10-08T19:10:08.369723', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67d89f50-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': 'd6840f9f2dc274980c15dc9d1b58c229e9091e34981c6b70f1b6abe68deaa180'}]}, 'timestamp': '2025-10-08 19:10:08.370149', '_unique_id': '9f0dde9bb34f4af881daf59f1aa52ac5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47a6b627-fc5e-4fb5-8652-1513110f5374', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.371238', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67d8d204-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': '08651bd8d1094a3105cd4850ed5d07504701c5c1c4c69a710ff754f06fbe0a66'}]}, 'timestamp': '2025-10-08 19:10:08.371461', '_unique_id': '87dba970f3a74036b4c8677fd0696a0a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.372 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.372 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '415e37a7-a946-42bd-b269-28f8a4c62d43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.372489', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67d902e2-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': 'acffae1af86121058c849fc4c65e6666e7c67074728dfd06efe201a9da9224cf'}]}, 'timestamp': '2025-10-08 19:10:08.372712', '_unique_id': 'ee51f4c9b1194e6ca60025fbe5fa5023'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '517f211b-ed32-497f-b2ec-d1ffcb3e23f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-vda', 'timestamp': '2025-10-08T19:10:08.373739', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67d9338e-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': '542216245f51a1271efc25227d94683acccf0707d58a0b4ab1a9a2c9869ac06f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-sda', 'timestamp': '2025-10-08T19:10:08.373739', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67d93c3a-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': 'b3222a8391f30d5307eb67567120fe721114191238020c244680afe52027fffc'}]}, 'timestamp': '2025-10-08 19:10:08.374165', '_unique_id': 'be1a80ae88bc4814acff129e670026c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc7ab9d5-19ff-404e-a6a5-f120317cb0be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.375251', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67d96ebc-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': '1070f57bc1646ac196e2e848dab3aead0fe3830b85c4671ae3215e174f35b1fd'}]}, 'timestamp': '2025-10-08 19:10:08.375473', '_unique_id': '6e628c5fa28f44f39f59615f90bad475'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:10:08 compute-0 nova_compute[117514]: 2025-10-08 19:10:08.406 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:10:08 compute-0 nova_compute[117514]: 2025-10-08 19:10:08.407 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:10:08 compute-0 nova_compute[117514]: 2025-10-08 19:10:08.407 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:10:08 compute-0 nova_compute[117514]: 2025-10-08 19:10:08.467 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 08 19:10:08 compute-0 nova_compute[117514]: 2025-10-08 19:10:08.467 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:10:08 compute-0 nova_compute[117514]: 2025-10-08 19:10:08.467 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:10:08 compute-0 nova_compute[117514]: 2025-10-08 19:10:08.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:10:09 compute-0 nova_compute[117514]: 2025-10-08 19:10:09.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:10:10 compute-0 nova_compute[117514]: 2025-10-08 19:10:10.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:11 compute-0 nova_compute[117514]: 2025-10-08 19:10:11.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:11 compute-0 podman[147547]: 2025-10-08 19:10:11.648720307 +0000 UTC m=+0.075476407 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:10:13 compute-0 ovn_controller[19759]: 2025-10-08T19:10:13Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4e:85:2e 10.100.0.14
Oct 08 19:10:13 compute-0 ovn_controller[19759]: 2025-10-08T19:10:13Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4e:85:2e 10.100.0.14
Oct 08 19:10:15 compute-0 nova_compute[117514]: 2025-10-08 19:10:15.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:16 compute-0 nova_compute[117514]: 2025-10-08 19:10:16.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:19 compute-0 nova_compute[117514]: 2025-10-08 19:10:19.314 2 INFO nova.compute.manager [None req-9d9e4250-7fda-4296-8a89-1c404f607141 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Get console output
Oct 08 19:10:19 compute-0 nova_compute[117514]: 2025-10-08 19:10:19.321 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct 08 19:10:20 compute-0 nova_compute[117514]: 2025-10-08 19:10:20.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:20 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:20.905 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:10:20 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:20.906 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 08 19:10:20 compute-0 nova_compute[117514]: 2025-10-08 19:10:20.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:21 compute-0 nova_compute[117514]: 2025-10-08 19:10:21.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:21 compute-0 podman[147574]: 2025-10-08 19:10:21.672313611 +0000 UTC m=+0.089306981 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:10:21 compute-0 nova_compute[117514]: 2025-10-08 19:10:21.937 2 DEBUG oslo_concurrency.lockutils [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "interface-783f8889-2bc8-4641-bdb9-95ee4226a2fd-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:10:21 compute-0 nova_compute[117514]: 2025-10-08 19:10:21.938 2 DEBUG oslo_concurrency.lockutils [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "interface-783f8889-2bc8-4641-bdb9-95ee4226a2fd-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:10:21 compute-0 nova_compute[117514]: 2025-10-08 19:10:21.939 2 DEBUG nova.objects.instance [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'flavor' on Instance uuid 783f8889-2bc8-4641-bdb9-95ee4226a2fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:10:23 compute-0 nova_compute[117514]: 2025-10-08 19:10:23.118 2 DEBUG nova.objects.instance [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_requests' on Instance uuid 783f8889-2bc8-4641-bdb9-95ee4226a2fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:10:23 compute-0 nova_compute[117514]: 2025-10-08 19:10:23.133 2 DEBUG nova.network.neutron [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 08 19:10:24 compute-0 nova_compute[117514]: 2025-10-08 19:10:24.158 2 DEBUG nova.policy [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 08 19:10:25 compute-0 nova_compute[117514]: 2025-10-08 19:10:25.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:26 compute-0 nova_compute[117514]: 2025-10-08 19:10:26.215 2 DEBUG nova.network.neutron [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Successfully created port: ea81e5cb-74ba-43da-a780-3f1f699fa0d6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 08 19:10:26 compute-0 nova_compute[117514]: 2025-10-08 19:10:26.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:27 compute-0 nova_compute[117514]: 2025-10-08 19:10:27.301 2 DEBUG nova.network.neutron [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Successfully updated port: ea81e5cb-74ba-43da-a780-3f1f699fa0d6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 08 19:10:27 compute-0 nova_compute[117514]: 2025-10-08 19:10:27.320 2 DEBUG oslo_concurrency.lockutils [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:10:27 compute-0 nova_compute[117514]: 2025-10-08 19:10:27.320 2 DEBUG oslo_concurrency.lockutils [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:10:27 compute-0 nova_compute[117514]: 2025-10-08 19:10:27.320 2 DEBUG nova.network.neutron [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 08 19:10:27 compute-0 nova_compute[117514]: 2025-10-08 19:10:27.455 2 DEBUG nova.compute.manager [req-73b66b7a-2084-400a-be0c-e954fb22f600 req-887a3fc2-4baa-498d-8aa7-9eba7dcdd365 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-changed-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:10:27 compute-0 nova_compute[117514]: 2025-10-08 19:10:27.456 2 DEBUG nova.compute.manager [req-73b66b7a-2084-400a-be0c-e954fb22f600 req-887a3fc2-4baa-498d-8aa7-9eba7dcdd365 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing instance network info cache due to event network-changed-ea81e5cb-74ba-43da-a780-3f1f699fa0d6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:10:27 compute-0 nova_compute[117514]: 2025-10-08 19:10:27.456 2 DEBUG oslo_concurrency.lockutils [req-73b66b7a-2084-400a-be0c-e954fb22f600 req-887a3fc2-4baa-498d-8aa7-9eba7dcdd365 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:10:27 compute-0 podman[147595]: 2025-10-08 19:10:27.66197255 +0000 UTC m=+0.073628133 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 08 19:10:27 compute-0 podman[147594]: 2025-10-08 19:10:27.675004011 +0000 UTC m=+0.083904344 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., distribution-scope=public, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 08 19:10:28 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:28.909 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.139 2 DEBUG nova.network.neutron [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.162 2 DEBUG oslo_concurrency.lockutils [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.164 2 DEBUG oslo_concurrency.lockutils [req-73b66b7a-2084-400a-be0c-e954fb22f600 req-887a3fc2-4baa-498d-8aa7-9eba7dcdd365 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.164 2 DEBUG nova.network.neutron [req-73b66b7a-2084-400a-be0c-e954fb22f600 req-887a3fc2-4baa-498d-8aa7-9eba7dcdd365 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing network info cache for port ea81e5cb-74ba-43da-a780-3f1f699fa0d6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.169 2 DEBUG nova.virt.libvirt.vif [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1641480242',display_name='tempest-TestNetworkBasicOps-server-1641480242',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1641480242',id=6,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPSyEE+QeB2DOtd7xoaY+J9mVl+DzPE43UDhso7eEGO9aQXs3wmj/YcqHfJ97lRUVFOa3dbwNiIUyunSI3DyzjQf/v6cjCZ2KkxRD0GJnQ0zRM5omnXaZRnz3Bq5VONa9g==',key_name='tempest-TestNetworkBasicOps-1535027603',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:09:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-pbt1zket',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:09:58Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=783f8889-2bc8-4641-bdb9-95ee4226a2fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.169 2 DEBUG nova.network.os_vif_util [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.170 2 DEBUG nova.network.os_vif_util [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:ac:ba,bridge_name='br-int',has_traffic_filtering=True,id=ea81e5cb-74ba-43da-a780-3f1f699fa0d6,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea81e5cb-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.171 2 DEBUG os_vif [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:ac:ba,bridge_name='br-int',has_traffic_filtering=True,id=ea81e5cb-74ba-43da-a780-3f1f699fa0d6,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea81e5cb-74') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.173 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.173 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.177 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea81e5cb-74, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.178 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapea81e5cb-74, col_values=(('external_ids', {'iface-id': 'ea81e5cb-74ba-43da-a780-3f1f699fa0d6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:11:ac:ba', 'vm-uuid': '783f8889-2bc8-4641-bdb9-95ee4226a2fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:10:30 compute-0 NetworkManager[1035]: <info>  [1759950630.1815] manager: (tapea81e5cb-74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.189 2 INFO os_vif [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:ac:ba,bridge_name='br-int',has_traffic_filtering=True,id=ea81e5cb-74ba-43da-a780-3f1f699fa0d6,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea81e5cb-74')
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.191 2 DEBUG nova.virt.libvirt.vif [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1641480242',display_name='tempest-TestNetworkBasicOps-server-1641480242',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1641480242',id=6,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPSyEE+QeB2DOtd7xoaY+J9mVl+DzPE43UDhso7eEGO9aQXs3wmj/YcqHfJ97lRUVFOa3dbwNiIUyunSI3DyzjQf/v6cjCZ2KkxRD0GJnQ0zRM5omnXaZRnz3Bq5VONa9g==',key_name='tempest-TestNetworkBasicOps-1535027603',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:09:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-pbt1zket',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:09:58Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=783f8889-2bc8-4641-bdb9-95ee4226a2fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.191 2 DEBUG nova.network.os_vif_util [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.192 2 DEBUG nova.network.os_vif_util [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:ac:ba,bridge_name='br-int',has_traffic_filtering=True,id=ea81e5cb-74ba-43da-a780-3f1f699fa0d6,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea81e5cb-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.195 2 DEBUG nova.virt.libvirt.guest [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] attach device xml: <interface type="ethernet">
Oct 08 19:10:30 compute-0 nova_compute[117514]:   <mac address="fa:16:3e:11:ac:ba"/>
Oct 08 19:10:30 compute-0 nova_compute[117514]:   <model type="virtio"/>
Oct 08 19:10:30 compute-0 nova_compute[117514]:   <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:10:30 compute-0 nova_compute[117514]:   <mtu size="1442"/>
Oct 08 19:10:30 compute-0 nova_compute[117514]:   <target dev="tapea81e5cb-74"/>
Oct 08 19:10:30 compute-0 nova_compute[117514]: </interface>
Oct 08 19:10:30 compute-0 nova_compute[117514]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Oct 08 19:10:30 compute-0 kernel: tapea81e5cb-74: entered promiscuous mode
Oct 08 19:10:30 compute-0 NetworkManager[1035]: <info>  [1759950630.2102] manager: (tapea81e5cb-74): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:30 compute-0 ovn_controller[19759]: 2025-10-08T19:10:30Z|00089|binding|INFO|Claiming lport ea81e5cb-74ba-43da-a780-3f1f699fa0d6 for this chassis.
Oct 08 19:10:30 compute-0 ovn_controller[19759]: 2025-10-08T19:10:30Z|00090|binding|INFO|ea81e5cb-74ba-43da-a780-3f1f699fa0d6: Claiming fa:16:3e:11:ac:ba 10.100.0.22
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.219 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:ac:ba 10.100.0.22'], port_security=['fa:16:3e:11:ac:ba 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3bc67bd-dd21-4701-b445-33eb52179602', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'be57f10c-6afc-483d-a1fa-fab953b8fe3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=325bd26c-56bb-4683-8b62-92cc8f266207, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=ea81e5cb-74ba-43da-a780-3f1f699fa0d6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.220 28643 INFO neutron.agent.ovn.metadata.agent [-] Port ea81e5cb-74ba-43da-a780-3f1f699fa0d6 in datapath e3bc67bd-dd21-4701-b445-33eb52179602 bound to our chassis
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.222 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e3bc67bd-dd21-4701-b445-33eb52179602
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.237 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[a55737fd-65e8-4a8b-9130-f3f19e781f7e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.238 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape3bc67bd-d1 in ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.242 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape3bc67bd-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.242 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[c907d04f-59e0-4dcd-8867-bdef60d54469]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.244 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[55d87543-9be8-4a5c-840f-987357fff54e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:30 compute-0 systemd-udevd[147643]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.257 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[80c8abc4-cf99-4fdd-8187-c723aafcc9a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:30 compute-0 ovn_controller[19759]: 2025-10-08T19:10:30Z|00091|binding|INFO|Setting lport ea81e5cb-74ba-43da-a780-3f1f699fa0d6 ovn-installed in OVS
Oct 08 19:10:30 compute-0 ovn_controller[19759]: 2025-10-08T19:10:30Z|00092|binding|INFO|Setting lport ea81e5cb-74ba-43da-a780-3f1f699fa0d6 up in Southbound
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:30 compute-0 NetworkManager[1035]: <info>  [1759950630.2752] device (tapea81e5cb-74): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 08 19:10:30 compute-0 NetworkManager[1035]: <info>  [1759950630.2780] device (tapea81e5cb-74): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.279 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[84627644-815c-4bec-a595-27a17d979b8a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.327 2 DEBUG nova.virt.libvirt.driver [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.326 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[6de15016-427b-47c0-9799-60e0799f1308]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.327 2 DEBUG nova.virt.libvirt.driver [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.327 2 DEBUG nova.virt.libvirt.driver [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:4e:85:2e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.328 2 DEBUG nova.virt.libvirt.driver [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:11:ac:ba, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 08 19:10:30 compute-0 NetworkManager[1035]: <info>  [1759950630.3350] manager: (tape3bc67bd-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/59)
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.334 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[85a6f7ca-1c0a-4abd-ad98-a6eaf188ff4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.374 2 DEBUG nova.virt.libvirt.guest [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:10:30 compute-0 nova_compute[117514]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:10:30 compute-0 nova_compute[117514]:   <nova:name>tempest-TestNetworkBasicOps-server-1641480242</nova:name>
Oct 08 19:10:30 compute-0 nova_compute[117514]:   <nova:creationTime>2025-10-08 19:10:30</nova:creationTime>
Oct 08 19:10:30 compute-0 nova_compute[117514]:   <nova:flavor name="m1.nano">
Oct 08 19:10:30 compute-0 nova_compute[117514]:     <nova:memory>128</nova:memory>
Oct 08 19:10:30 compute-0 nova_compute[117514]:     <nova:disk>1</nova:disk>
Oct 08 19:10:30 compute-0 nova_compute[117514]:     <nova:swap>0</nova:swap>
Oct 08 19:10:30 compute-0 nova_compute[117514]:     <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:10:30 compute-0 nova_compute[117514]:     <nova:vcpus>1</nova:vcpus>
Oct 08 19:10:30 compute-0 nova_compute[117514]:   </nova:flavor>
Oct 08 19:10:30 compute-0 nova_compute[117514]:   <nova:owner>
Oct 08 19:10:30 compute-0 nova_compute[117514]:     <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:10:30 compute-0 nova_compute[117514]:     <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:10:30 compute-0 nova_compute[117514]:   </nova:owner>
Oct 08 19:10:30 compute-0 nova_compute[117514]:   <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:10:30 compute-0 nova_compute[117514]:   <nova:ports>
Oct 08 19:10:30 compute-0 nova_compute[117514]:     <nova:port uuid="bfb32e9e-52b6-4043-b9a6-129d11fa2814">
Oct 08 19:10:30 compute-0 nova_compute[117514]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct 08 19:10:30 compute-0 nova_compute[117514]:     </nova:port>
Oct 08 19:10:30 compute-0 nova_compute[117514]:     <nova:port uuid="ea81e5cb-74ba-43da-a780-3f1f699fa0d6">
Oct 08 19:10:30 compute-0 nova_compute[117514]:       <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Oct 08 19:10:30 compute-0 nova_compute[117514]:     </nova:port>
Oct 08 19:10:30 compute-0 nova_compute[117514]:   </nova:ports>
Oct 08 19:10:30 compute-0 nova_compute[117514]: </nova:instance>
Oct 08 19:10:30 compute-0 nova_compute[117514]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.399 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[139d505c-4552-4f51-9b58-ddda5401b8d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.405 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[878301e3-c867-42b7-96ba-5c811cdd3cb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.411 2 DEBUG oslo_concurrency.lockutils [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "interface-783f8889-2bc8-4641-bdb9-95ee4226a2fd-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 8.474s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:10:30 compute-0 NetworkManager[1035]: <info>  [1759950630.4390] device (tape3bc67bd-d0): carrier: link connected
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.444 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[95ff4254-4085-4d3d-afc3-2207a81ab8f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.472 2 DEBUG nova.compute.manager [req-e8678dfb-f035-4c70-83b4-67f3ac0735d2 req-0b4d8d3d-dc2b-4cf9-9a53-8fe6ca67fd4a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-plugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.472 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1a24a715-8530-4812-a83e-da8d6385f160]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3bc67bd-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:4f:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 130101, 'reachable_time': 29029, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 147668, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.473 2 DEBUG oslo_concurrency.lockutils [req-e8678dfb-f035-4c70-83b4-67f3ac0735d2 req-0b4d8d3d-dc2b-4cf9-9a53-8fe6ca67fd4a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.473 2 DEBUG oslo_concurrency.lockutils [req-e8678dfb-f035-4c70-83b4-67f3ac0735d2 req-0b4d8d3d-dc2b-4cf9-9a53-8fe6ca67fd4a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.474 2 DEBUG oslo_concurrency.lockutils [req-e8678dfb-f035-4c70-83b4-67f3ac0735d2 req-0b4d8d3d-dc2b-4cf9-9a53-8fe6ca67fd4a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.474 2 DEBUG nova.compute.manager [req-e8678dfb-f035-4c70-83b4-67f3ac0735d2 req-0b4d8d3d-dc2b-4cf9-9a53-8fe6ca67fd4a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] No waiting events found dispatching network-vif-plugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.475 2 WARNING nova.compute.manager [req-e8678dfb-f035-4c70-83b4-67f3ac0735d2 req-0b4d8d3d-dc2b-4cf9-9a53-8fe6ca67fd4a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received unexpected event network-vif-plugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 for instance with vm_state active and task_state None.
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.497 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[72df0f75-e409-4fa8-ba9b-b44c8781df77]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:4fd1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 130101, 'tstamp': 130101}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 147669, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.520 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[0a4ab9b3-45fe-4b4b-aee4-e7da262d8712]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3bc67bd-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:4f:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 130101, 'reachable_time': 29029, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 147670, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.567 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b0941d90-61a0-4733-b1f2-a81d6a710edf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.659 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[d001dad3-da35-4d79-8d31-2a38f447cb83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.662 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3bc67bd-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.663 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.663 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3bc67bd-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:30 compute-0 NetworkManager[1035]: <info>  [1759950630.6671] manager: (tape3bc67bd-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Oct 08 19:10:30 compute-0 kernel: tape3bc67bd-d0: entered promiscuous mode
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.671 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape3bc67bd-d0, col_values=(('external_ids', {'iface-id': 'd935682a-e42a-4970-b54c-b54c616cf798'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:30 compute-0 ovn_controller[19759]: 2025-10-08T19:10:30Z|00093|binding|INFO|Releasing lport d935682a-e42a-4970-b54c-b54c616cf798 from this chassis (sb_readonly=0)
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.692 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e3bc67bd-dd21-4701-b445-33eb52179602.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e3bc67bd-dd21-4701-b445-33eb52179602.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.693 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[3e4c012b-71c5-45b3-8ab9-d7a96c426bb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.694 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: global
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     log         /dev/log local0 debug
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     log-tag     haproxy-metadata-proxy-e3bc67bd-dd21-4701-b445-33eb52179602
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     user        root
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     group       root
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     maxconn     1024
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     pidfile     /var/lib/neutron/external/pids/e3bc67bd-dd21-4701-b445-33eb52179602.pid.haproxy
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     daemon
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: defaults
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     log global
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     mode http
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     option httplog
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     option dontlognull
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     option http-server-close
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     option forwardfor
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     retries                 3
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     timeout http-request    30s
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     timeout connect         30s
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     timeout client          32s
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     timeout server          32s
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     timeout http-keep-alive 30s
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: listen listener
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     bind 169.254.169.254:80
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     server metadata /var/lib/neutron/metadata_proxy
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:     http-request add-header X-OVN-Network-ID e3bc67bd-dd21-4701-b445-33eb52179602
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 08 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.695 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'env', 'PROCESS_TAG=haproxy-e3bc67bd-dd21-4701-b445-33eb52179602', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e3bc67bd-dd21-4701-b445-33eb52179602.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 08 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:31 compute-0 podman[147702]: 2025-10-08 19:10:31.138712833 +0000 UTC m=+0.075484598 container create 0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 08 19:10:31 compute-0 systemd[1]: Started libpod-conmon-0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40.scope.
Oct 08 19:10:31 compute-0 podman[147702]: 2025-10-08 19:10:31.099022383 +0000 UTC m=+0.035794188 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 08 19:10:31 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:10:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbb48eabd75c06324d43161ee30ba9aa731d2058597c747932479a573242f471/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 08 19:10:31 compute-0 podman[147702]: 2025-10-08 19:10:31.217889697 +0000 UTC m=+0.154661462 container init 0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:10:31 compute-0 podman[147702]: 2025-10-08 19:10:31.225014015 +0000 UTC m=+0.161785750 container start 0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:10:31 compute-0 neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602[147719]: [NOTICE]   (147733) : New worker (147741) forked
Oct 08 19:10:31 compute-0 neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602[147719]: [NOTICE]   (147733) : Loading success.
Oct 08 19:10:31 compute-0 podman[147716]: 2025-10-08 19:10:31.259437071 +0000 UTC m=+0.079953688 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:10:31 compute-0 nova_compute[117514]: 2025-10-08 19:10:31.395 2 DEBUG nova.network.neutron [req-73b66b7a-2084-400a-be0c-e954fb22f600 req-887a3fc2-4baa-498d-8aa7-9eba7dcdd365 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updated VIF entry in instance network info cache for port ea81e5cb-74ba-43da-a780-3f1f699fa0d6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:10:31 compute-0 nova_compute[117514]: 2025-10-08 19:10:31.396 2 DEBUG nova.network.neutron [req-73b66b7a-2084-400a-be0c-e954fb22f600 req-887a3fc2-4baa-498d-8aa7-9eba7dcdd365 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:10:31 compute-0 ovn_controller[19759]: 2025-10-08T19:10:31Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:11:ac:ba 10.100.0.22
Oct 08 19:10:31 compute-0 ovn_controller[19759]: 2025-10-08T19:10:31Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:11:ac:ba 10.100.0.22
Oct 08 19:10:31 compute-0 nova_compute[117514]: 2025-10-08 19:10:31.412 2 DEBUG oslo_concurrency.lockutils [req-73b66b7a-2084-400a-be0c-e954fb22f600 req-887a3fc2-4baa-498d-8aa7-9eba7dcdd365 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:10:32 compute-0 nova_compute[117514]: 2025-10-08 19:10:32.559 2 DEBUG nova.compute.manager [req-9d8caff5-883f-4c82-864a-561afd01d1cc req-8eb2906a-7337-4758-a46f-d8188665bea0 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-plugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:10:32 compute-0 nova_compute[117514]: 2025-10-08 19:10:32.560 2 DEBUG oslo_concurrency.lockutils [req-9d8caff5-883f-4c82-864a-561afd01d1cc req-8eb2906a-7337-4758-a46f-d8188665bea0 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:10:32 compute-0 nova_compute[117514]: 2025-10-08 19:10:32.560 2 DEBUG oslo_concurrency.lockutils [req-9d8caff5-883f-4c82-864a-561afd01d1cc req-8eb2906a-7337-4758-a46f-d8188665bea0 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:10:32 compute-0 nova_compute[117514]: 2025-10-08 19:10:32.561 2 DEBUG oslo_concurrency.lockutils [req-9d8caff5-883f-4c82-864a-561afd01d1cc req-8eb2906a-7337-4758-a46f-d8188665bea0 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:10:32 compute-0 nova_compute[117514]: 2025-10-08 19:10:32.561 2 DEBUG nova.compute.manager [req-9d8caff5-883f-4c82-864a-561afd01d1cc req-8eb2906a-7337-4758-a46f-d8188665bea0 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] No waiting events found dispatching network-vif-plugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:10:32 compute-0 nova_compute[117514]: 2025-10-08 19:10:32.561 2 WARNING nova.compute.manager [req-9d8caff5-883f-4c82-864a-561afd01d1cc req-8eb2906a-7337-4758-a46f-d8188665bea0 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received unexpected event network-vif-plugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 for instance with vm_state active and task_state None.
Oct 08 19:10:33 compute-0 podman[147757]: 2025-10-08 19:10:33.674582698 +0000 UTC m=+0.069918645 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Oct 08 19:10:33 compute-0 podman[147755]: 2025-10-08 19:10:33.683730785 +0000 UTC m=+0.088765445 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 08 19:10:33 compute-0 podman[147756]: 2025-10-08 19:10:33.733898741 +0000 UTC m=+0.139698324 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 08 19:10:35 compute-0 nova_compute[117514]: 2025-10-08 19:10:35.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:35 compute-0 nova_compute[117514]: 2025-10-08 19:10:35.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:40 compute-0 nova_compute[117514]: 2025-10-08 19:10:40.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:40 compute-0 nova_compute[117514]: 2025-10-08 19:10:40.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:42 compute-0 podman[147817]: 2025-10-08 19:10:42.636963316 +0000 UTC m=+0.059150213 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:10:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:44.231 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:10:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:44.232 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:10:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:44.233 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.624 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "6af51230-93a7-45ef-9a1e-c47302f43bcf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.625 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.643 2 DEBUG nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 08 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.727 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.727 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.737 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 08 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.738 2 INFO nova.compute.claims [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Claim successful on node compute-0.ctlplane.example.com
Oct 08 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.891 2 DEBUG nova.compute.provider_tree [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.907 2 DEBUG nova.scheduler.client.report [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.939 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.940 2 DEBUG nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 08 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.994 2 DEBUG nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 08 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.995 2 DEBUG nova.network.neutron [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.014 2 INFO nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.035 2 DEBUG nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.123 2 DEBUG nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.125 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.126 2 INFO nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Creating image(s)
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.127 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.128 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.129 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.153 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.229 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.231 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.232 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.255 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.322 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.323 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.375 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.376 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.377 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.439 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.440 2 DEBUG nova.virt.disk.api [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.441 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.505 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.506 2 DEBUG nova.virt.disk.api [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.507 2 DEBUG nova.objects.instance [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 6af51230-93a7-45ef-9a1e-c47302f43bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.529 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.530 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Ensure instance console log exists: /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.531 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.531 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.532 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:46 compute-0 nova_compute[117514]: 2025-10-08 19:10:46.137 2 DEBUG nova.policy [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 08 19:10:47 compute-0 nova_compute[117514]: 2025-10-08 19:10:47.719 2 DEBUG nova.network.neutron [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Successfully created port: 062b16e8-3c3b-4520-b0f8-536d588db2f5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 08 19:10:49 compute-0 nova_compute[117514]: 2025-10-08 19:10:49.242 2 DEBUG nova.network.neutron [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Successfully updated port: 062b16e8-3c3b-4520-b0f8-536d588db2f5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 08 19:10:49 compute-0 nova_compute[117514]: 2025-10-08 19:10:49.262 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-6af51230-93a7-45ef-9a1e-c47302f43bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:10:49 compute-0 nova_compute[117514]: 2025-10-08 19:10:49.262 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-6af51230-93a7-45ef-9a1e-c47302f43bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:10:49 compute-0 nova_compute[117514]: 2025-10-08 19:10:49.263 2 DEBUG nova.network.neutron [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 08 19:10:49 compute-0 nova_compute[117514]: 2025-10-08 19:10:49.380 2 DEBUG nova.compute.manager [req-64a2a93f-5a7b-44ee-9876-16e5bef090f3 req-ebe6251b-bdb6-4253-adb9-fd5abb039c97 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Received event network-changed-062b16e8-3c3b-4520-b0f8-536d588db2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:10:49 compute-0 nova_compute[117514]: 2025-10-08 19:10:49.381 2 DEBUG nova.compute.manager [req-64a2a93f-5a7b-44ee-9876-16e5bef090f3 req-ebe6251b-bdb6-4253-adb9-fd5abb039c97 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Refreshing instance network info cache due to event network-changed-062b16e8-3c3b-4520-b0f8-536d588db2f5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:10:49 compute-0 nova_compute[117514]: 2025-10-08 19:10:49.381 2 DEBUG oslo_concurrency.lockutils [req-64a2a93f-5a7b-44ee-9876-16e5bef090f3 req-ebe6251b-bdb6-4253-adb9-fd5abb039c97 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-6af51230-93a7-45ef-9a1e-c47302f43bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:10:50 compute-0 nova_compute[117514]: 2025-10-08 19:10:50.109 2 DEBUG nova.network.neutron [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 08 19:10:50 compute-0 nova_compute[117514]: 2025-10-08 19:10:50.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:50 compute-0 nova_compute[117514]: 2025-10-08 19:10:50.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.144 2 DEBUG nova.network.neutron [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Updating instance_info_cache with network_info: [{"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.165 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-6af51230-93a7-45ef-9a1e-c47302f43bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.166 2 DEBUG nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Instance network_info: |[{"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.167 2 DEBUG oslo_concurrency.lockutils [req-64a2a93f-5a7b-44ee-9876-16e5bef090f3 req-ebe6251b-bdb6-4253-adb9-fd5abb039c97 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-6af51230-93a7-45ef-9a1e-c47302f43bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.168 2 DEBUG nova.network.neutron [req-64a2a93f-5a7b-44ee-9876-16e5bef090f3 req-ebe6251b-bdb6-4253-adb9-fd5abb039c97 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Refreshing network info cache for port 062b16e8-3c3b-4520-b0f8-536d588db2f5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.173 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Start _get_guest_xml network_info=[{"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.181 2 WARNING nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.193 2 DEBUG nova.virt.libvirt.host [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.194 2 DEBUG nova.virt.libvirt.host [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.205 2 DEBUG nova.virt.libvirt.host [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.206 2 DEBUG nova.virt.libvirt.host [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.207 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.208 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.208 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.209 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.209 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.210 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.210 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.211 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.211 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.212 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.213 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.213 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.220 2 DEBUG nova.virt.libvirt.vif [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:10:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1744393953',display_name='tempest-TestNetworkBasicOps-server-1744393953',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1744393953',id=7,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ/nGXxhsdJmWHabE3HFa5+3pmT1eGAFwd96u9XHC+whrqyLo5hIQAYJiUfXapQHjQsYnRIxe45Y0OXwPlQza5nnuSeUdl81Vlbahpy7snJ2RnOlPvASQfobelq2pqhHKA==',key_name='tempest-TestNetworkBasicOps-575443871',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-ke980fn8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:10:45Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=6af51230-93a7-45ef-9a1e-c47302f43bcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.221 2 DEBUG nova.network.os_vif_util [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.222 2 DEBUG nova.network.os_vif_util [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:be:8a,bridge_name='br-int',has_traffic_filtering=True,id=062b16e8-3c3b-4520-b0f8-536d588db2f5,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062b16e8-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.226 2 DEBUG nova.objects.instance [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6af51230-93a7-45ef-9a1e-c47302f43bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.242 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] End _get_guest_xml xml=<domain type="kvm">
Oct 08 19:10:52 compute-0 nova_compute[117514]:   <uuid>6af51230-93a7-45ef-9a1e-c47302f43bcf</uuid>
Oct 08 19:10:52 compute-0 nova_compute[117514]:   <name>instance-00000007</name>
Oct 08 19:10:52 compute-0 nova_compute[117514]:   <memory>131072</memory>
Oct 08 19:10:52 compute-0 nova_compute[117514]:   <vcpu>1</vcpu>
Oct 08 19:10:52 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <nova:name>tempest-TestNetworkBasicOps-server-1744393953</nova:name>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <nova:creationTime>2025-10-08 19:10:52</nova:creationTime>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <nova:flavor name="m1.nano">
Oct 08 19:10:52 compute-0 nova_compute[117514]:         <nova:memory>128</nova:memory>
Oct 08 19:10:52 compute-0 nova_compute[117514]:         <nova:disk>1</nova:disk>
Oct 08 19:10:52 compute-0 nova_compute[117514]:         <nova:swap>0</nova:swap>
Oct 08 19:10:52 compute-0 nova_compute[117514]:         <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:10:52 compute-0 nova_compute[117514]:         <nova:vcpus>1</nova:vcpus>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       </nova:flavor>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <nova:owner>
Oct 08 19:10:52 compute-0 nova_compute[117514]:         <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:10:52 compute-0 nova_compute[117514]:         <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       </nova:owner>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <nova:ports>
Oct 08 19:10:52 compute-0 nova_compute[117514]:         <nova:port uuid="062b16e8-3c3b-4520-b0f8-536d588db2f5">
Oct 08 19:10:52 compute-0 nova_compute[117514]:           <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:         </nova:port>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       </nova:ports>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     </nova:instance>
Oct 08 19:10:52 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:10:52 compute-0 nova_compute[117514]:   <sysinfo type="smbios">
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <system>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <entry name="manufacturer">RDO</entry>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <entry name="product">OpenStack Compute</entry>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <entry name="serial">6af51230-93a7-45ef-9a1e-c47302f43bcf</entry>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <entry name="uuid">6af51230-93a7-45ef-9a1e-c47302f43bcf</entry>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <entry name="family">Virtual Machine</entry>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     </system>
Oct 08 19:10:52 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:10:52 compute-0 nova_compute[117514]:   <os>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <boot dev="hd"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <smbios mode="sysinfo"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:   </os>
Oct 08 19:10:52 compute-0 nova_compute[117514]:   <features>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <vmcoreinfo/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:   </features>
Oct 08 19:10:52 compute-0 nova_compute[117514]:   <clock offset="utc">
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <timer name="pit" tickpolicy="delay"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <timer name="hpet" present="no"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:10:52 compute-0 nova_compute[117514]:   <cpu mode="host-model" match="exact">
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <topology sockets="1" cores="1" threads="1"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:10:52 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <disk type="file" device="disk">
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <target dev="vda" bus="virtio"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <disk type="file" device="cdrom">
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <driver name="qemu" type="raw" cache="none"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk.config"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <target dev="sda" bus="sata"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <interface type="ethernet">
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <mac address="fa:16:3e:49:be:8a"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <mtu size="1442"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <target dev="tap062b16e8-3c"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <serial type="pty">
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <log file="/var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/console.log" append="off"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <video>
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     </video>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <input type="tablet" bus="usb"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <rng model="virtio">
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <backend model="random">/dev/urandom</backend>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <controller type="usb" index="0"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     <memballoon model="virtio">
Oct 08 19:10:52 compute-0 nova_compute[117514]:       <stats period="10"/>
Oct 08 19:10:52 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:10:52 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:10:52 compute-0 nova_compute[117514]: </domain>
Oct 08 19:10:52 compute-0 nova_compute[117514]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.244 2 DEBUG nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Preparing to wait for external event network-vif-plugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.244 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.245 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.246 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.247 2 DEBUG nova.virt.libvirt.vif [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:10:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1744393953',display_name='tempest-TestNetworkBasicOps-server-1744393953',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1744393953',id=7,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ/nGXxhsdJmWHabE3HFa5+3pmT1eGAFwd96u9XHC+whrqyLo5hIQAYJiUfXapQHjQsYnRIxe45Y0OXwPlQza5nnuSeUdl81Vlbahpy7snJ2RnOlPvASQfobelq2pqhHKA==',key_name='tempest-TestNetworkBasicOps-575443871',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-ke980fn8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:10:45Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=6af51230-93a7-45ef-9a1e-c47302f43bcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.248 2 DEBUG nova.network.os_vif_util [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.249 2 DEBUG nova.network.os_vif_util [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:be:8a,bridge_name='br-int',has_traffic_filtering=True,id=062b16e8-3c3b-4520-b0f8-536d588db2f5,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062b16e8-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.250 2 DEBUG os_vif [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:be:8a,bridge_name='br-int',has_traffic_filtering=True,id=062b16e8-3c3b-4520-b0f8-536d588db2f5,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062b16e8-3c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.252 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.253 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.258 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap062b16e8-3c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.259 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap062b16e8-3c, col_values=(('external_ids', {'iface-id': '062b16e8-3c3b-4520-b0f8-536d588db2f5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:49:be:8a', 'vm-uuid': '6af51230-93a7-45ef-9a1e-c47302f43bcf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:10:52 compute-0 NetworkManager[1035]: <info>  [1759950652.2632] manager: (tap062b16e8-3c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.273 2 INFO os_vif [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:be:8a,bridge_name='br-int',has_traffic_filtering=True,id=062b16e8-3c3b-4520-b0f8-536d588db2f5,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062b16e8-3c')
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.322 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.323 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.324 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:49:be:8a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 08 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.325 2 INFO nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Using config drive
Oct 08 19:10:52 compute-0 podman[147858]: 2025-10-08 19:10:52.64848277 +0000 UTC m=+0.074502207 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Oct 08 19:10:53 compute-0 nova_compute[117514]: 2025-10-08 19:10:53.254 2 INFO nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Creating config drive at /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk.config
Oct 08 19:10:53 compute-0 nova_compute[117514]: 2025-10-08 19:10:53.259 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv_e8qxmf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:10:53 compute-0 nova_compute[117514]: 2025-10-08 19:10:53.395 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv_e8qxmf" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:10:53 compute-0 NetworkManager[1035]: <info>  [1759950653.4797] manager: (tap062b16e8-3c): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Oct 08 19:10:53 compute-0 kernel: tap062b16e8-3c: entered promiscuous mode
Oct 08 19:10:53 compute-0 ovn_controller[19759]: 2025-10-08T19:10:53Z|00094|binding|INFO|Claiming lport 062b16e8-3c3b-4520-b0f8-536d588db2f5 for this chassis.
Oct 08 19:10:53 compute-0 ovn_controller[19759]: 2025-10-08T19:10:53Z|00095|binding|INFO|062b16e8-3c3b-4520-b0f8-536d588db2f5: Claiming fa:16:3e:49:be:8a 10.100.0.24
Oct 08 19:10:53 compute-0 nova_compute[117514]: 2025-10-08 19:10:53.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.497 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:be:8a 10.100.0.24'], port_security=['fa:16:3e:49:be:8a 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': '6af51230-93a7-45ef-9a1e-c47302f43bcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3bc67bd-dd21-4701-b445-33eb52179602', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a7ebd8cf-2e32-494a-bac7-d2c7c2ffc36a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=325bd26c-56bb-4683-8b62-92cc8f266207, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=062b16e8-3c3b-4520-b0f8-536d588db2f5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:10:53 compute-0 ovn_controller[19759]: 2025-10-08T19:10:53Z|00096|binding|INFO|Setting lport 062b16e8-3c3b-4520-b0f8-536d588db2f5 up in Southbound
Oct 08 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.500 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 062b16e8-3c3b-4520-b0f8-536d588db2f5 in datapath e3bc67bd-dd21-4701-b445-33eb52179602 bound to our chassis
Oct 08 19:10:53 compute-0 ovn_controller[19759]: 2025-10-08T19:10:53Z|00097|binding|INFO|Setting lport 062b16e8-3c3b-4520-b0f8-536d588db2f5 ovn-installed in OVS
Oct 08 19:10:53 compute-0 nova_compute[117514]: 2025-10-08 19:10:53.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.504 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e3bc67bd-dd21-4701-b445-33eb52179602
Oct 08 19:10:53 compute-0 nova_compute[117514]: 2025-10-08 19:10:53.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.526 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[9707f6b2-8c19-4bb4-941c-815b93edbb1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:53 compute-0 systemd-udevd[147898]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:10:53 compute-0 systemd-machined[77568]: New machine qemu-7-instance-00000007.
Oct 08 19:10:53 compute-0 NetworkManager[1035]: <info>  [1759950653.5542] device (tap062b16e8-3c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 08 19:10:53 compute-0 NetworkManager[1035]: <info>  [1759950653.5553] device (tap062b16e8-3c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 08 19:10:53 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000007.
Oct 08 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.561 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[2bc8f78e-60a9-408b-af3f-750664ad33f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.565 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[8f715c55-3505-4576-b1ed-4ac8256d0186]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.602 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[6b1d07ed-a766-44b8-8f52-63fd41fc9346]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.631 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e8c5d870-8fc2-4993-ad2a-62c4292d398b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3bc67bd-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:4f:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 130101, 'reachable_time': 29029, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 147907, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.657 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[cec0ecea-ed82-4475-824f-28a08b1b52c6]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape3bc67bd-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 130118, 'tstamp': 130118}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 147910, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tape3bc67bd-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 130123, 'tstamp': 130123}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 147910, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.659 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3bc67bd-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:10:53 compute-0 nova_compute[117514]: 2025-10-08 19:10:53.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:53 compute-0 nova_compute[117514]: 2025-10-08 19:10:53.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.663 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3bc67bd-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.663 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.664 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape3bc67bd-d0, col_values=(('external_ids', {'iface-id': 'd935682a-e42a-4970-b54c-b54c616cf798'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.664 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.301 2 DEBUG nova.compute.manager [req-57d1e0ce-bece-4a4c-a337-ce24b22dac81 req-b4b95a35-2872-4e05-b4ec-5278ea11752c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Received event network-vif-plugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.302 2 DEBUG oslo_concurrency.lockutils [req-57d1e0ce-bece-4a4c-a337-ce24b22dac81 req-b4b95a35-2872-4e05-b4ec-5278ea11752c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.302 2 DEBUG oslo_concurrency.lockutils [req-57d1e0ce-bece-4a4c-a337-ce24b22dac81 req-b4b95a35-2872-4e05-b4ec-5278ea11752c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.303 2 DEBUG oslo_concurrency.lockutils [req-57d1e0ce-bece-4a4c-a337-ce24b22dac81 req-b4b95a35-2872-4e05-b4ec-5278ea11752c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.303 2 DEBUG nova.compute.manager [req-57d1e0ce-bece-4a4c-a337-ce24b22dac81 req-b4b95a35-2872-4e05-b4ec-5278ea11752c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Processing event network-vif-plugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.372 2 DEBUG nova.network.neutron [req-64a2a93f-5a7b-44ee-9876-16e5bef090f3 req-ebe6251b-bdb6-4253-adb9-fd5abb039c97 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Updated VIF entry in instance network info cache for port 062b16e8-3c3b-4520-b0f8-536d588db2f5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.373 2 DEBUG nova.network.neutron [req-64a2a93f-5a7b-44ee-9876-16e5bef090f3 req-ebe6251b-bdb6-4253-adb9-fd5abb039c97 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Updating instance_info_cache with network_info: [{"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.391 2 DEBUG oslo_concurrency.lockutils [req-64a2a93f-5a7b-44ee-9876-16e5bef090f3 req-ebe6251b-bdb6-4253-adb9-fd5abb039c97 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-6af51230-93a7-45ef-9a1e-c47302f43bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.849 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950654.848302, 6af51230-93a7-45ef-9a1e-c47302f43bcf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.849 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] VM Started (Lifecycle Event)
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.853 2 DEBUG nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.856 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.860 2 INFO nova.virt.libvirt.driver [-] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Instance spawned successfully.
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.861 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.882 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.893 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.901 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.902 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.903 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.904 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.905 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.906 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.942 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.943 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950654.8484669, 6af51230-93a7-45ef-9a1e-c47302f43bcf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.943 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] VM Paused (Lifecycle Event)
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.975 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.980 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950654.8558252, 6af51230-93a7-45ef-9a1e-c47302f43bcf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.981 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] VM Resumed (Lifecycle Event)
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.989 2 INFO nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Took 9.86 seconds to spawn the instance on the hypervisor.
Oct 08 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.989 2 DEBUG nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:10:55 compute-0 nova_compute[117514]: 2025-10-08 19:10:55.002 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:10:55 compute-0 nova_compute[117514]: 2025-10-08 19:10:55.007 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:10:55 compute-0 nova_compute[117514]: 2025-10-08 19:10:55.043 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:10:55 compute-0 nova_compute[117514]: 2025-10-08 19:10:55.073 2 INFO nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Took 10.38 seconds to build instance.
Oct 08 19:10:55 compute-0 nova_compute[117514]: 2025-10-08 19:10:55.091 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.466s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:10:55 compute-0 nova_compute[117514]: 2025-10-08 19:10:55.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:56 compute-0 nova_compute[117514]: 2025-10-08 19:10:56.389 2 DEBUG nova.compute.manager [req-a63cbb7d-410e-48eb-8159-f18f9d9a1947 req-82204da9-6e7f-4ddd-8e5a-203d270a81a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Received event network-vif-plugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:10:56 compute-0 nova_compute[117514]: 2025-10-08 19:10:56.390 2 DEBUG oslo_concurrency.lockutils [req-a63cbb7d-410e-48eb-8159-f18f9d9a1947 req-82204da9-6e7f-4ddd-8e5a-203d270a81a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:10:56 compute-0 nova_compute[117514]: 2025-10-08 19:10:56.391 2 DEBUG oslo_concurrency.lockutils [req-a63cbb7d-410e-48eb-8159-f18f9d9a1947 req-82204da9-6e7f-4ddd-8e5a-203d270a81a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:10:56 compute-0 nova_compute[117514]: 2025-10-08 19:10:56.392 2 DEBUG oslo_concurrency.lockutils [req-a63cbb7d-410e-48eb-8159-f18f9d9a1947 req-82204da9-6e7f-4ddd-8e5a-203d270a81a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:10:56 compute-0 nova_compute[117514]: 2025-10-08 19:10:56.392 2 DEBUG nova.compute.manager [req-a63cbb7d-410e-48eb-8159-f18f9d9a1947 req-82204da9-6e7f-4ddd-8e5a-203d270a81a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] No waiting events found dispatching network-vif-plugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:10:56 compute-0 nova_compute[117514]: 2025-10-08 19:10:56.393 2 WARNING nova.compute.manager [req-a63cbb7d-410e-48eb-8159-f18f9d9a1947 req-82204da9-6e7f-4ddd-8e5a-203d270a81a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Received unexpected event network-vif-plugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 for instance with vm_state active and task_state None.
Oct 08 19:10:57 compute-0 nova_compute[117514]: 2025-10-08 19:10:57.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:10:58 compute-0 podman[147920]: 2025-10-08 19:10:58.640654616 +0000 UTC m=+0.057319670 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 08 19:10:58 compute-0 podman[147919]: 2025-10-08 19:10:58.667971687 +0000 UTC m=+0.080591444 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 08 19:11:00 compute-0 nova_compute[117514]: 2025-10-08 19:11:00.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:01 compute-0 podman[147959]: 2025-10-08 19:11:01.663488804 +0000 UTC m=+0.078641677 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 08 19:11:02 compute-0 nova_compute[117514]: 2025-10-08 19:11:02.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:03 compute-0 nova_compute[117514]: 2025-10-08 19:11:03.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:11:04 compute-0 podman[147993]: 2025-10-08 19:11:04.68595602 +0000 UTC m=+0.091934061 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true)
Oct 08 19:11:04 compute-0 podman[147996]: 2025-10-08 19:11:04.705425074 +0000 UTC m=+0.100214961 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 08 19:11:04 compute-0 podman[147995]: 2025-10-08 19:11:04.716515905 +0000 UTC m=+0.123270869 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Oct 08 19:11:05 compute-0 ovn_controller[19759]: 2025-10-08T19:11:05Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:49:be:8a 10.100.0.24
Oct 08 19:11:05 compute-0 ovn_controller[19759]: 2025-10-08T19:11:05Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:49:be:8a 10.100.0.24
Oct 08 19:11:05 compute-0 nova_compute[117514]: 2025-10-08 19:11:05.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:06 compute-0 nova_compute[117514]: 2025-10-08 19:11:06.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:11:06 compute-0 nova_compute[117514]: 2025-10-08 19:11:06.719 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 08 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.732 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.732 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.733 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.759 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.759 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.760 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.760 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.845 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.935 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.936 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.029 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.039 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.132 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.134 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.207 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.469 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.470 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5758MB free_disk=73.35517883300781GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.471 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.471 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.684 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Instance 783f8889-2bc8-4641-bdb9-95ee4226a2fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.685 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Instance 6af51230-93a7-45ef-9a1e-c47302f43bcf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.686 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.686 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.737 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Refreshing inventories for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.808 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updating ProviderTree inventory for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.809 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updating inventory in ProviderTree for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.824 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Refreshing aggregate associations for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.848 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Refreshing trait associations for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AVX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.908 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.924 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.956 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.956 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.485s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:11:09 compute-0 nova_compute[117514]: 2025-10-08 19:11:09.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:11:09 compute-0 nova_compute[117514]: 2025-10-08 19:11:09.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:11:09 compute-0 nova_compute[117514]: 2025-10-08 19:11:09.741 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:11:09 compute-0 nova_compute[117514]: 2025-10-08 19:11:09.741 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:11:09 compute-0 nova_compute[117514]: 2025-10-08 19:11:09.742 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:11:10 compute-0 nova_compute[117514]: 2025-10-08 19:11:10.111 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:11:10 compute-0 nova_compute[117514]: 2025-10-08 19:11:10.112 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquired lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:11:10 compute-0 nova_compute[117514]: 2025-10-08 19:11:10.113 2 DEBUG nova.network.neutron [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 08 19:11:10 compute-0 nova_compute[117514]: 2025-10-08 19:11:10.113 2 DEBUG nova.objects.instance [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 783f8889-2bc8-4641-bdb9-95ee4226a2fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:11:10 compute-0 nova_compute[117514]: 2025-10-08 19:11:10.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:11 compute-0 nova_compute[117514]: 2025-10-08 19:11:11.643 2 DEBUG nova.network.neutron [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:11:11 compute-0 nova_compute[117514]: 2025-10-08 19:11:11.666 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Releasing lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:11:11 compute-0 nova_compute[117514]: 2025-10-08 19:11:11.666 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 08 19:11:11 compute-0 nova_compute[117514]: 2025-10-08 19:11:11.667 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:11:11 compute-0 nova_compute[117514]: 2025-10-08 19:11:11.668 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:11:11 compute-0 nova_compute[117514]: 2025-10-08 19:11:11.669 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:11:11 compute-0 nova_compute[117514]: 2025-10-08 19:11:11.669 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:11:11 compute-0 nova_compute[117514]: 2025-10-08 19:11:11.670 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:11:12 compute-0 nova_compute[117514]: 2025-10-08 19:11:12.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:12 compute-0 nova_compute[117514]: 2025-10-08 19:11:12.728 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:11:12 compute-0 nova_compute[117514]: 2025-10-08 19:11:12.729 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 08 19:11:12 compute-0 nova_compute[117514]: 2025-10-08 19:11:12.750 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 08 19:11:13 compute-0 nova_compute[117514]: 2025-10-08 19:11:13.515 2 DEBUG nova.compute.manager [req-170bf41f-a65f-4b98-b4db-2f2e975c59bf req-6cc8365b-dc88-4f15-b73c-4cd5c87feb71 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-changed-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:11:13 compute-0 nova_compute[117514]: 2025-10-08 19:11:13.516 2 DEBUG nova.compute.manager [req-170bf41f-a65f-4b98-b4db-2f2e975c59bf req-6cc8365b-dc88-4f15-b73c-4cd5c87feb71 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing instance network info cache due to event network-changed-ea81e5cb-74ba-43da-a780-3f1f699fa0d6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:11:13 compute-0 nova_compute[117514]: 2025-10-08 19:11:13.516 2 DEBUG oslo_concurrency.lockutils [req-170bf41f-a65f-4b98-b4db-2f2e975c59bf req-6cc8365b-dc88-4f15-b73c-4cd5c87feb71 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:11:13 compute-0 nova_compute[117514]: 2025-10-08 19:11:13.517 2 DEBUG oslo_concurrency.lockutils [req-170bf41f-a65f-4b98-b4db-2f2e975c59bf req-6cc8365b-dc88-4f15-b73c-4cd5c87feb71 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:11:13 compute-0 nova_compute[117514]: 2025-10-08 19:11:13.517 2 DEBUG nova.network.neutron [req-170bf41f-a65f-4b98-b4db-2f2e975c59bf req-6cc8365b-dc88-4f15-b73c-4cd5c87feb71 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing network info cache for port ea81e5cb-74ba-43da-a780-3f1f699fa0d6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:11:13 compute-0 podman[148072]: 2025-10-08 19:11:13.666324393 +0000 UTC m=+0.076061272 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 08 19:11:14 compute-0 nova_compute[117514]: 2025-10-08 19:11:14.362 2 DEBUG nova.network.neutron [req-170bf41f-a65f-4b98-b4db-2f2e975c59bf req-6cc8365b-dc88-4f15-b73c-4cd5c87feb71 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updated VIF entry in instance network info cache for port ea81e5cb-74ba-43da-a780-3f1f699fa0d6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:11:14 compute-0 nova_compute[117514]: 2025-10-08 19:11:14.363 2 DEBUG nova.network.neutron [req-170bf41f-a65f-4b98-b4db-2f2e975c59bf req-6cc8365b-dc88-4f15-b73c-4cd5c87feb71 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:11:14 compute-0 nova_compute[117514]: 2025-10-08 19:11:14.379 2 DEBUG oslo_concurrency.lockutils [req-170bf41f-a65f-4b98-b4db-2f2e975c59bf req-6cc8365b-dc88-4f15-b73c-4cd5c87feb71 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:11:15 compute-0 nova_compute[117514]: 2025-10-08 19:11:15.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:17 compute-0 nova_compute[117514]: 2025-10-08 19:11:17.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:18 compute-0 nova_compute[117514]: 2025-10-08 19:11:18.129 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:11:18 compute-0 nova_compute[117514]: 2025-10-08 19:11:18.150 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Triggering sync for uuid 783f8889-2bc8-4641-bdb9-95ee4226a2fd _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Oct 08 19:11:18 compute-0 nova_compute[117514]: 2025-10-08 19:11:18.151 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Triggering sync for uuid 6af51230-93a7-45ef-9a1e-c47302f43bcf _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Oct 08 19:11:18 compute-0 nova_compute[117514]: 2025-10-08 19:11:18.151 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:11:18 compute-0 nova_compute[117514]: 2025-10-08 19:11:18.152 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:11:18 compute-0 nova_compute[117514]: 2025-10-08 19:11:18.152 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "6af51230-93a7-45ef-9a1e-c47302f43bcf" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:11:18 compute-0 nova_compute[117514]: 2025-10-08 19:11:18.153 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:11:18 compute-0 nova_compute[117514]: 2025-10-08 19:11:18.229 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:11:18 compute-0 nova_compute[117514]: 2025-10-08 19:11:18.231 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:11:21 compute-0 nova_compute[117514]: 2025-10-08 19:11:21.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:22 compute-0 nova_compute[117514]: 2025-10-08 19:11:22.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:23 compute-0 podman[148096]: 2025-10-08 19:11:23.650455211 +0000 UTC m=+0.074246939 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true)
Oct 08 19:11:26 compute-0 nova_compute[117514]: 2025-10-08 19:11:26.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:27 compute-0 nova_compute[117514]: 2025-10-08 19:11:27.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:29 compute-0 podman[148118]: 2025-10-08 19:11:29.637572011 +0000 UTC m=+0.061353377 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 08 19:11:29 compute-0 podman[148117]: 2025-10-08 19:11:29.652325578 +0000 UTC m=+0.069917215 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter)
Oct 08 19:11:31 compute-0 nova_compute[117514]: 2025-10-08 19:11:31.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:32 compute-0 nova_compute[117514]: 2025-10-08 19:11:32.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:32 compute-0 podman[148158]: 2025-10-08 19:11:32.651551792 +0000 UTC m=+0.076879844 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:11:35 compute-0 podman[148184]: 2025-10-08 19:11:35.649262663 +0000 UTC m=+0.069383789 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 08 19:11:35 compute-0 podman[148182]: 2025-10-08 19:11:35.679793957 +0000 UTC m=+0.098211943 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, tcib_managed=true)
Oct 08 19:11:35 compute-0 podman[148183]: 2025-10-08 19:11:35.709845866 +0000 UTC m=+0.125472532 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 08 19:11:36 compute-0 nova_compute[117514]: 2025-10-08 19:11:36.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:37 compute-0 nova_compute[117514]: 2025-10-08 19:11:37.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:41 compute-0 nova_compute[117514]: 2025-10-08 19:11:41.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:42 compute-0 nova_compute[117514]: 2025-10-08 19:11:42.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:43 compute-0 ovn_controller[19759]: 2025-10-08T19:11:43Z|00098|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 08 19:11:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:44.232 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:11:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:44.233 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:11:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:44.235 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:11:44 compute-0 podman[148242]: 2025-10-08 19:11:44.668697295 +0000 UTC m=+0.083457646 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 08 19:11:46 compute-0 nova_compute[117514]: 2025-10-08 19:11:46.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:47 compute-0 nova_compute[117514]: 2025-10-08 19:11:47.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:51 compute-0 nova_compute[117514]: 2025-10-08 19:11:51.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:52 compute-0 nova_compute[117514]: 2025-10-08 19:11:52.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:54 compute-0 podman[148266]: 2025-10-08 19:11:54.666222163 +0000 UTC m=+0.090221532 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:11:56 compute-0 nova_compute[117514]: 2025-10-08 19:11:56.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:57 compute-0 nova_compute[117514]: 2025-10-08 19:11:57.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:57 compute-0 nova_compute[117514]: 2025-10-08 19:11:57.976 2 DEBUG oslo_concurrency.lockutils [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "6af51230-93a7-45ef-9a1e-c47302f43bcf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:11:57 compute-0 nova_compute[117514]: 2025-10-08 19:11:57.977 2 DEBUG oslo_concurrency.lockutils [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:11:57 compute-0 nova_compute[117514]: 2025-10-08 19:11:57.977 2 DEBUG oslo_concurrency.lockutils [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:11:57 compute-0 nova_compute[117514]: 2025-10-08 19:11:57.978 2 DEBUG oslo_concurrency.lockutils [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:11:57 compute-0 nova_compute[117514]: 2025-10-08 19:11:57.978 2 DEBUG oslo_concurrency.lockutils [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:11:57 compute-0 nova_compute[117514]: 2025-10-08 19:11:57.981 2 INFO nova.compute.manager [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Terminating instance
Oct 08 19:11:57 compute-0 nova_compute[117514]: 2025-10-08 19:11:57.983 2 DEBUG nova.compute.manager [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 08 19:11:58 compute-0 kernel: tap062b16e8-3c (unregistering): left promiscuous mode
Oct 08 19:11:58 compute-0 NetworkManager[1035]: <info>  [1759950718.0188] device (tap062b16e8-3c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:58 compute-0 ovn_controller[19759]: 2025-10-08T19:11:58Z|00099|binding|INFO|Releasing lport 062b16e8-3c3b-4520-b0f8-536d588db2f5 from this chassis (sb_readonly=0)
Oct 08 19:11:58 compute-0 ovn_controller[19759]: 2025-10-08T19:11:58Z|00100|binding|INFO|Setting lport 062b16e8-3c3b-4520-b0f8-536d588db2f5 down in Southbound
Oct 08 19:11:58 compute-0 ovn_controller[19759]: 2025-10-08T19:11:58Z|00101|binding|INFO|Removing iface tap062b16e8-3c ovn-installed in OVS
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.045 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:be:8a 10.100.0.24'], port_security=['fa:16:3e:49:be:8a 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': '6af51230-93a7-45ef-9a1e-c47302f43bcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3bc67bd-dd21-4701-b445-33eb52179602', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a7ebd8cf-2e32-494a-bac7-d2c7c2ffc36a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=325bd26c-56bb-4683-8b62-92cc8f266207, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=062b16e8-3c3b-4520-b0f8-536d588db2f5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.047 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 062b16e8-3c3b-4520-b0f8-536d588db2f5 in datapath e3bc67bd-dd21-4701-b445-33eb52179602 unbound from our chassis
Oct 08 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.049 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e3bc67bd-dd21-4701-b445-33eb52179602
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.071 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[bc3585de-8852-4a28-9a58-bb432951b116]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.099 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[b0ffdab6-c181-481f-8975-4d4809daa520]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.101 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[78b97c2d-0ea7-44e6-b780-741c8661471d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:11:58 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Deactivated successfully.
Oct 08 19:11:58 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Consumed 14.599s CPU time.
Oct 08 19:11:58 compute-0 systemd-machined[77568]: Machine qemu-7-instance-00000007 terminated.
Oct 08 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.125 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[5287ee8a-0659-4662-a2fb-94117d2cbb60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.141 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[d3251fe8-f580-46b7-98d5-183049219694]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3bc67bd-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:4f:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 15, 'tx_packets': 7, 'rx_bytes': 1222, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 15, 'tx_packets': 7, 'rx_bytes': 1222, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 130101, 'reachable_time': 29029, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 10, 'inoctets': 872, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 10, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 872, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 10, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 148299, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.155 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb39fda-bbb7-43f4-bf7a-3c8ccb532151]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape3bc67bd-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 130118, 'tstamp': 130118}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 148300, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tape3bc67bd-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 130123, 'tstamp': 130123}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 148300, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.156 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3bc67bd-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.162 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3bc67bd-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.162 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.163 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape3bc67bd-d0, col_values=(('external_ids', {'iface-id': 'd935682a-e42a-4970-b54c-b54c616cf798'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.163 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.239 2 INFO nova.virt.libvirt.driver [-] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Instance destroyed successfully.
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.239 2 DEBUG nova.objects.instance [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 6af51230-93a7-45ef-9a1e-c47302f43bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.253 2 DEBUG nova.virt.libvirt.vif [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:10:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1744393953',display_name='tempest-TestNetworkBasicOps-server-1744393953',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1744393953',id=7,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ/nGXxhsdJmWHabE3HFa5+3pmT1eGAFwd96u9XHC+whrqyLo5hIQAYJiUfXapQHjQsYnRIxe45Y0OXwPlQza5nnuSeUdl81Vlbahpy7snJ2RnOlPvASQfobelq2pqhHKA==',key_name='tempest-TestNetworkBasicOps-575443871',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:10:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-ke980fn8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:10:55Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=6af51230-93a7-45ef-9a1e-c47302f43bcf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.253 2 DEBUG nova.network.os_vif_util [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.254 2 DEBUG nova.network.os_vif_util [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:be:8a,bridge_name='br-int',has_traffic_filtering=True,id=062b16e8-3c3b-4520-b0f8-536d588db2f5,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062b16e8-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.254 2 DEBUG os_vif [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:be:8a,bridge_name='br-int',has_traffic_filtering=True,id=062b16e8-3c3b-4520-b0f8-536d588db2f5,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062b16e8-3c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.256 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap062b16e8-3c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.261 2 INFO os_vif [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:be:8a,bridge_name='br-int',has_traffic_filtering=True,id=062b16e8-3c3b-4520-b0f8-536d588db2f5,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062b16e8-3c')
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.261 2 INFO nova.virt.libvirt.driver [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Deleting instance files /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf_del
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.262 2 INFO nova.virt.libvirt.driver [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Deletion of /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf_del complete
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.323 2 INFO nova.compute.manager [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Took 0.34 seconds to destroy the instance on the hypervisor.
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.324 2 DEBUG oslo.service.loopingcall [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.324 2 DEBUG nova.compute.manager [-] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.325 2 DEBUG nova.network.neutron [-] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.354 2 DEBUG nova.compute.manager [req-c88323bd-afff-41b0-8f1e-449dd222b828 req-5519c56f-b968-4ae8-9d5e-83ad54848912 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Received event network-vif-unplugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.354 2 DEBUG oslo_concurrency.lockutils [req-c88323bd-afff-41b0-8f1e-449dd222b828 req-5519c56f-b968-4ae8-9d5e-83ad54848912 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.355 2 DEBUG oslo_concurrency.lockutils [req-c88323bd-afff-41b0-8f1e-449dd222b828 req-5519c56f-b968-4ae8-9d5e-83ad54848912 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.355 2 DEBUG oslo_concurrency.lockutils [req-c88323bd-afff-41b0-8f1e-449dd222b828 req-5519c56f-b968-4ae8-9d5e-83ad54848912 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.355 2 DEBUG nova.compute.manager [req-c88323bd-afff-41b0-8f1e-449dd222b828 req-5519c56f-b968-4ae8-9d5e-83ad54848912 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] No waiting events found dispatching network-vif-unplugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.356 2 DEBUG nova.compute.manager [req-c88323bd-afff-41b0-8f1e-449dd222b828 req-5519c56f-b968-4ae8-9d5e-83ad54848912 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Received event network-vif-unplugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 08 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.434 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.435 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.812 2 DEBUG nova.network.neutron [-] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.833 2 INFO nova.compute.manager [-] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Took 0.51 seconds to deallocate network for instance.
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.884 2 DEBUG oslo_concurrency.lockutils [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.885 2 DEBUG oslo_concurrency.lockutils [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.889 2 DEBUG nova.compute.manager [req-144479d1-b1bc-429c-835c-fdbabf1c1230 req-24cdfdd0-ce63-4712-8288-cff53fd9846e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Received event network-vif-deleted-062b16e8-3c3b-4520-b0f8-536d588db2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.972 2 DEBUG nova.compute.provider_tree [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.989 2 DEBUG nova.scheduler.client.report [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:11:59 compute-0 nova_compute[117514]: 2025-10-08 19:11:59.024 2 DEBUG oslo_concurrency.lockutils [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:11:59 compute-0 nova_compute[117514]: 2025-10-08 19:11:59.061 2 INFO nova.scheduler.client.report [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 6af51230-93a7-45ef-9a1e-c47302f43bcf
Oct 08 19:11:59 compute-0 nova_compute[117514]: 2025-10-08 19:11:59.159 2 DEBUG oslo_concurrency.lockutils [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.439 2 DEBUG nova.compute.manager [req-4843d87d-c925-4a63-ab00-749832b02af7 req-0e9ea112-3ea9-458a-8679-775ed33b8318 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Received event network-vif-plugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.440 2 DEBUG oslo_concurrency.lockutils [req-4843d87d-c925-4a63-ab00-749832b02af7 req-0e9ea112-3ea9-458a-8679-775ed33b8318 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.441 2 DEBUG oslo_concurrency.lockutils [req-4843d87d-c925-4a63-ab00-749832b02af7 req-0e9ea112-3ea9-458a-8679-775ed33b8318 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.441 2 DEBUG oslo_concurrency.lockutils [req-4843d87d-c925-4a63-ab00-749832b02af7 req-0e9ea112-3ea9-458a-8679-775ed33b8318 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.441 2 DEBUG nova.compute.manager [req-4843d87d-c925-4a63-ab00-749832b02af7 req-0e9ea112-3ea9-458a-8679-775ed33b8318 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] No waiting events found dispatching network-vif-plugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.442 2 WARNING nova.compute.manager [req-4843d87d-c925-4a63-ab00-749832b02af7 req-0e9ea112-3ea9-458a-8679-775ed33b8318 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Received unexpected event network-vif-plugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 for instance with vm_state deleted and task_state None.
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.445 2 DEBUG oslo_concurrency.lockutils [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "interface-783f8889-2bc8-4641-bdb9-95ee4226a2fd-ea81e5cb-74ba-43da-a780-3f1f699fa0d6" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.445 2 DEBUG oslo_concurrency.lockutils [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "interface-783f8889-2bc8-4641-bdb9-95ee4226a2fd-ea81e5cb-74ba-43da-a780-3f1f699fa0d6" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.472 2 DEBUG nova.objects.instance [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'flavor' on Instance uuid 783f8889-2bc8-4641-bdb9-95ee4226a2fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.494 2 DEBUG nova.virt.libvirt.vif [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1641480242',display_name='tempest-TestNetworkBasicOps-server-1641480242',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1641480242',id=6,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPSyEE+QeB2DOtd7xoaY+J9mVl+DzPE43UDhso7eEGO9aQXs3wmj/YcqHfJ97lRUVFOa3dbwNiIUyunSI3DyzjQf/v6cjCZ2KkxRD0GJnQ0zRM5omnXaZRnz3Bq5VONa9g==',key_name='tempest-TestNetworkBasicOps-1535027603',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:09:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-pbt1zket',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:09:58Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=783f8889-2bc8-4641-bdb9-95ee4226a2fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.494 2 DEBUG nova.network.os_vif_util [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.495 2 DEBUG nova.network.os_vif_util [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:11:ac:ba,bridge_name='br-int',has_traffic_filtering=True,id=ea81e5cb-74ba-43da-a780-3f1f699fa0d6,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea81e5cb-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.500 2 DEBUG nova.virt.libvirt.guest [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:11:ac:ba"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapea81e5cb-74"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.502 2 DEBUG nova.virt.libvirt.guest [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:11:ac:ba"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapea81e5cb-74"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.504 2 DEBUG nova.virt.libvirt.driver [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Attempting to detach device tapea81e5cb-74 from instance 783f8889-2bc8-4641-bdb9-95ee4226a2fd from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.505 2 DEBUG nova.virt.libvirt.guest [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] detach device xml: <interface type="ethernet">
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <mac address="fa:16:3e:11:ac:ba"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <model type="virtio"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <mtu size="1442"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <target dev="tapea81e5cb-74"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]: </interface>
Oct 08 19:12:00 compute-0 nova_compute[117514]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.514 2 DEBUG nova.virt.libvirt.guest [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:11:ac:ba"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapea81e5cb-74"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.518 2 DEBUG nova.virt.libvirt.guest [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:11:ac:ba"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapea81e5cb-74"/></interface>not found in domain: <domain type='kvm' id='6'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <name>instance-00000006</name>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <uuid>783f8889-2bc8-4641-bdb9-95ee4226a2fd</uuid>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:name>tempest-TestNetworkBasicOps-server-1641480242</nova:name>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:creationTime>2025-10-08 19:10:30</nova:creationTime>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:flavor name="m1.nano">
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:memory>128</nova:memory>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:disk>1</nova:disk>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:swap>0</nova:swap>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:vcpus>1</nova:vcpus>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </nova:flavor>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:owner>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </nova:owner>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:ports>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:port uuid="bfb32e9e-52b6-4043-b9a6-129d11fa2814">
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </nova:port>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:port uuid="ea81e5cb-74ba-43da-a780-3f1f699fa0d6">
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </nova:port>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </nova:ports>
Oct 08 19:12:00 compute-0 nova_compute[117514]: </nova:instance>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <memory unit='KiB'>131072</memory>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <currentMemory unit='KiB'>131072</currentMemory>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <vcpu placement='static'>1</vcpu>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <resource>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <partition>/machine</partition>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </resource>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <sysinfo type='smbios'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <system>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <entry name='manufacturer'>RDO</entry>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <entry name='product'>OpenStack Compute</entry>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <entry name='serial'>783f8889-2bc8-4641-bdb9-95ee4226a2fd</entry>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <entry name='uuid'>783f8889-2bc8-4641-bdb9-95ee4226a2fd</entry>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <entry name='family'>Virtual Machine</entry>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </system>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <os>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <boot dev='hd'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <smbios mode='sysinfo'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </os>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <features>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <vmcoreinfo state='on'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </features>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <cpu mode='custom' match='exact' check='full'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <model fallback='forbid'>EPYC-Rome</model>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <vendor>AMD</vendor>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='x2apic'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='tsc-deadline'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='hypervisor'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='tsc_adjust'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='spec-ctrl'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='stibp'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='arch-capabilities'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='ssbd'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='cmp_legacy'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='overflow-recov'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='succor'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='ibrs'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='amd-ssbd'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='virt-ssbd'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='lbrv'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='tsc-scale'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='vmcb-clean'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='flushbyasid'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='pause-filter'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='pfthreshold'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='svme-addr-chk'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='lfence-always-serializing'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='rdctl-no'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='mds-no'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='pschange-mc-no'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='gds-no'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='rfds-no'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='xsaves'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='svm'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='topoext'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='npt'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='nrip-save'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <clock offset='utc'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <timer name='pit' tickpolicy='delay'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <timer name='rtc' tickpolicy='catchup'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <timer name='hpet' present='no'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <on_poweroff>destroy</on_poweroff>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <on_reboot>restart</on_reboot>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <on_crash>destroy</on_crash>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <disk type='file' device='disk'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <driver name='qemu' type='qcow2' cache='none'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <source file='/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk' index='2'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <backingStore type='file' index='3'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:         <format type='raw'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:         <source file='/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:         <backingStore/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       </backingStore>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target dev='vda' bus='virtio'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='virtio-disk0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <disk type='file' device='cdrom'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <driver name='qemu' type='raw' cache='none'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <source file='/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.config' index='1'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <backingStore/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target dev='sda' bus='sata'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <readonly/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='sata0-0-0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='0' model='pcie-root'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pcie.0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='1' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='1' port='0x10'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.1'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='2' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='2' port='0x11'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.2'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='3' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='3' port='0x12'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.3'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='4' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='4' port='0x13'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.4'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='5' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='5' port='0x14'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.5'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='6' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='6' port='0x15'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.6'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='7' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='7' port='0x16'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.7'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='8' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='8' port='0x17'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.8'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='9' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='9' port='0x18'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.9'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='10' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='10' port='0x19'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.10'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='11' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='11' port='0x1a'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.11'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='12' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='12' port='0x1b'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.12'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='13' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='13' port='0x1c'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.13'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='14' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='14' port='0x1d'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.14'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='15' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='15' port='0x1e'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.15'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='16' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='16' port='0x1f'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.16'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='17' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='17' port='0x20'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.17'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='18' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='18' port='0x21'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.18'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='19' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='19' port='0x22'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.19'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='20' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='20' port='0x23'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.20'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='21' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='21' port='0x24'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.21'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='22' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='22' port='0x25'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.22'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='23' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='23' port='0x26'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.23'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='24' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='24' port='0x27'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.24'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='25' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='25' port='0x28'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.25'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-pci-bridge'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.26'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='usb' index='0' model='piix3-uhci'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='usb'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='sata' index='0'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='ide'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <interface type='ethernet'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <mac address='fa:16:3e:4e:85:2e'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target dev='tapbfb32e9e-52'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model type='virtio'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <driver name='vhost' rx_queue_size='512'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <mtu size='1442'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='net0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <interface type='ethernet'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <mac address='fa:16:3e:11:ac:ba'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target dev='tapea81e5cb-74'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model type='virtio'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <driver name='vhost' rx_queue_size='512'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <mtu size='1442'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='net1'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <serial type='pty'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <source path='/dev/pts/0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <log file='/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/console.log' append='off'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target type='isa-serial' port='0'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:         <model name='isa-serial'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       </target>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='serial0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <console type='pty' tty='/dev/pts/0'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <source path='/dev/pts/0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <log file='/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/console.log' append='off'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target type='serial' port='0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='serial0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </console>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <input type='tablet' bus='usb'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='input0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='usb' bus='0' port='1'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </input>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <input type='mouse' bus='ps2'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='input1'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </input>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <input type='keyboard' bus='ps2'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='input2'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </input>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <listen type='address' address='::0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </graphics>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <audio id='1' type='none'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <video>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model type='virtio' heads='1' primary='yes'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='video0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </video>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <watchdog model='itco' action='reset'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='watchdog0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </watchdog>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <memballoon model='virtio'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <stats period='10'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='balloon0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <rng model='virtio'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <backend model='random'>/dev/urandom</backend>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='rng0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <label>system_u:system_r:svirt_t:s0:c55,c685</label>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c55,c685</imagelabel>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </seclabel>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <label>+107:+107</label>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <imagelabel>+107:+107</imagelabel>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </seclabel>
Oct 08 19:12:00 compute-0 nova_compute[117514]: </domain>
Oct 08 19:12:00 compute-0 nova_compute[117514]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.518 2 INFO nova.virt.libvirt.driver [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully detached device tapea81e5cb-74 from instance 783f8889-2bc8-4641-bdb9-95ee4226a2fd from the persistent domain config.
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.518 2 DEBUG nova.virt.libvirt.driver [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] (1/8): Attempting to detach device tapea81e5cb-74 with device alias net1 from instance 783f8889-2bc8-4641-bdb9-95ee4226a2fd from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.518 2 DEBUG nova.virt.libvirt.guest [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] detach device xml: <interface type="ethernet">
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <mac address="fa:16:3e:11:ac:ba"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <model type="virtio"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <mtu size="1442"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <target dev="tapea81e5cb-74"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]: </interface>
Oct 08 19:12:00 compute-0 nova_compute[117514]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Oct 08 19:12:00 compute-0 kernel: tapea81e5cb-74 (unregistering): left promiscuous mode
Oct 08 19:12:00 compute-0 NetworkManager[1035]: <info>  [1759950720.6153] device (tapea81e5cb-74): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 19:12:00 compute-0 ovn_controller[19759]: 2025-10-08T19:12:00Z|00102|binding|INFO|Releasing lport ea81e5cb-74ba-43da-a780-3f1f699fa0d6 from this chassis (sb_readonly=0)
Oct 08 19:12:00 compute-0 ovn_controller[19759]: 2025-10-08T19:12:00Z|00103|binding|INFO|Setting lport ea81e5cb-74ba-43da-a780-3f1f699fa0d6 down in Southbound
Oct 08 19:12:00 compute-0 ovn_controller[19759]: 2025-10-08T19:12:00Z|00104|binding|INFO|Removing iface tapea81e5cb-74 ovn-installed in OVS
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.665 2 DEBUG nova.virt.libvirt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Received event <DeviceRemovedEvent: 1759950720.6636052, 783f8889-2bc8-4641-bdb9-95ee4226a2fd => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.667 2 DEBUG nova.virt.libvirt.driver [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Start waiting for the detach event from libvirt for device tapea81e5cb-74 with device alias net1 for instance 783f8889-2bc8-4641-bdb9-95ee4226a2fd _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.668 2 DEBUG nova.virt.libvirt.guest [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:11:ac:ba"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapea81e5cb-74"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct 08 19:12:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:00.668 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:ac:ba 10.100.0.22', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3bc67bd-dd21-4701-b445-33eb52179602', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=325bd26c-56bb-4683-8b62-92cc8f266207, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=ea81e5cb-74ba-43da-a780-3f1f699fa0d6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:12:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:00.669 28643 INFO neutron.agent.ovn.metadata.agent [-] Port ea81e5cb-74ba-43da-a780-3f1f699fa0d6 in datapath e3bc67bd-dd21-4701-b445-33eb52179602 unbound from our chassis
Oct 08 19:12:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:00.670 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e3bc67bd-dd21-4701-b445-33eb52179602, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 08 19:12:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:00.671 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[cce2bf17-6b30-436e-a3dc-3f302d6d4a0d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:00.672 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602 namespace which is not needed anymore
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.678 2 DEBUG nova.virt.libvirt.guest [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:11:ac:ba"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapea81e5cb-74"/></interface>not found in domain: <domain type='kvm' id='6'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <name>instance-00000006</name>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <uuid>783f8889-2bc8-4641-bdb9-95ee4226a2fd</uuid>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:name>tempest-TestNetworkBasicOps-server-1641480242</nova:name>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:creationTime>2025-10-08 19:10:30</nova:creationTime>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:flavor name="m1.nano">
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:memory>128</nova:memory>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:disk>1</nova:disk>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:swap>0</nova:swap>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:vcpus>1</nova:vcpus>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </nova:flavor>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:owner>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </nova:owner>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:ports>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:port uuid="bfb32e9e-52b6-4043-b9a6-129d11fa2814">
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </nova:port>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:port uuid="ea81e5cb-74ba-43da-a780-3f1f699fa0d6">
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </nova:port>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </nova:ports>
Oct 08 19:12:00 compute-0 nova_compute[117514]: </nova:instance>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <memory unit='KiB'>131072</memory>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <currentMemory unit='KiB'>131072</currentMemory>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <vcpu placement='static'>1</vcpu>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <resource>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <partition>/machine</partition>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </resource>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <sysinfo type='smbios'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <system>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <entry name='manufacturer'>RDO</entry>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <entry name='product'>OpenStack Compute</entry>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <entry name='serial'>783f8889-2bc8-4641-bdb9-95ee4226a2fd</entry>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <entry name='uuid'>783f8889-2bc8-4641-bdb9-95ee4226a2fd</entry>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <entry name='family'>Virtual Machine</entry>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </system>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <os>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <boot dev='hd'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <smbios mode='sysinfo'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </os>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <features>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <vmcoreinfo state='on'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </features>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <cpu mode='custom' match='exact' check='full'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <model fallback='forbid'>EPYC-Rome</model>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <vendor>AMD</vendor>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='x2apic'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='tsc-deadline'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='hypervisor'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='tsc_adjust'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='spec-ctrl'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='stibp'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='arch-capabilities'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='ssbd'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='cmp_legacy'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='overflow-recov'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='succor'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='ibrs'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='amd-ssbd'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='virt-ssbd'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='lbrv'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='tsc-scale'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='vmcb-clean'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='flushbyasid'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='pause-filter'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='pfthreshold'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='svme-addr-chk'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='lfence-always-serializing'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='rdctl-no'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='mds-no'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='pschange-mc-no'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='gds-no'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='rfds-no'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='xsaves'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='svm'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='require' name='topoext'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='npt'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <feature policy='disable' name='nrip-save'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <clock offset='utc'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <timer name='pit' tickpolicy='delay'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <timer name='rtc' tickpolicy='catchup'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <timer name='hpet' present='no'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <on_poweroff>destroy</on_poweroff>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <on_reboot>restart</on_reboot>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <on_crash>destroy</on_crash>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <disk type='file' device='disk'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <driver name='qemu' type='qcow2' cache='none'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <source file='/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk' index='2'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <backingStore type='file' index='3'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:         <format type='raw'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:         <source file='/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:         <backingStore/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       </backingStore>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target dev='vda' bus='virtio'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='virtio-disk0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <disk type='file' device='cdrom'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <driver name='qemu' type='raw' cache='none'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <source file='/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.config' index='1'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <backingStore/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target dev='sda' bus='sata'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <readonly/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='sata0-0-0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='0' model='pcie-root'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pcie.0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='1' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='1' port='0x10'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.1'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='2' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='2' port='0x11'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.2'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='3' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='3' port='0x12'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.3'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='4' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='4' port='0x13'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.4'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='5' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='5' port='0x14'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.5'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='6' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='6' port='0x15'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.6'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='7' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='7' port='0x16'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.7'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='8' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='8' port='0x17'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.8'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='9' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='9' port='0x18'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.9'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='10' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='10' port='0x19'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.10'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='11' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='11' port='0x1a'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.11'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='12' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='12' port='0x1b'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.12'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='13' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='13' port='0x1c'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.13'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='14' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='14' port='0x1d'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.14'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='15' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='15' port='0x1e'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.15'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='16' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='16' port='0x1f'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.16'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='17' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='17' port='0x20'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.17'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='18' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='18' port='0x21'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.18'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='19' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='19' port='0x22'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.19'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='20' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='20' port='0x23'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.20'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='21' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='21' port='0x24'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.21'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='22' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='22' port='0x25'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.22'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='23' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='23' port='0x26'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.23'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='24' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='24' port='0x27'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.24'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='25' model='pcie-root-port'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-root-port'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target chassis='25' port='0x28'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.25'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model name='pcie-pci-bridge'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='pci.26'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='usb' index='0' model='piix3-uhci'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='usb'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <controller type='sata' index='0'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='ide'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </controller>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <interface type='ethernet'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <mac address='fa:16:3e:4e:85:2e'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target dev='tapbfb32e9e-52'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model type='virtio'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <driver name='vhost' rx_queue_size='512'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <mtu size='1442'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='net0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <serial type='pty'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <source path='/dev/pts/0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <log file='/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/console.log' append='off'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target type='isa-serial' port='0'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:         <model name='isa-serial'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       </target>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='serial0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <console type='pty' tty='/dev/pts/0'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <source path='/dev/pts/0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <log file='/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/console.log' append='off'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <target type='serial' port='0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='serial0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </console>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <input type='tablet' bus='usb'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='input0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='usb' bus='0' port='1'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </input>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <input type='mouse' bus='ps2'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='input1'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </input>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <input type='keyboard' bus='ps2'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='input2'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </input>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <listen type='address' address='::0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </graphics>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <audio id='1' type='none'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <video>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <model type='virtio' heads='1' primary='yes'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='video0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </video>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <watchdog model='itco' action='reset'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='watchdog0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </watchdog>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <memballoon model='virtio'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <stats period='10'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='balloon0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <rng model='virtio'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <backend model='random'>/dev/urandom</backend>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <alias name='rng0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <label>system_u:system_r:svirt_t:s0:c55,c685</label>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c55,c685</imagelabel>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </seclabel>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <label>+107:+107</label>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <imagelabel>+107:+107</imagelabel>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </seclabel>
Oct 08 19:12:00 compute-0 nova_compute[117514]: </domain>
Oct 08 19:12:00 compute-0 nova_compute[117514]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.678 2 INFO nova.virt.libvirt.driver [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully detached device tapea81e5cb-74 from instance 783f8889-2bc8-4641-bdb9-95ee4226a2fd from the live domain config.
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.680 2 DEBUG nova.virt.libvirt.vif [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1641480242',display_name='tempest-TestNetworkBasicOps-server-1641480242',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1641480242',id=6,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPSyEE+QeB2DOtd7xoaY+J9mVl+DzPE43UDhso7eEGO9aQXs3wmj/YcqHfJ97lRUVFOa3dbwNiIUyunSI3DyzjQf/v6cjCZ2KkxRD0GJnQ0zRM5omnXaZRnz3Bq5VONa9g==',key_name='tempest-TestNetworkBasicOps-1535027603',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:09:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-pbt1zket',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:09:58Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=783f8889-2bc8-4641-bdb9-95ee4226a2fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.681 2 DEBUG nova.network.os_vif_util [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.682 2 DEBUG nova.network.os_vif_util [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:11:ac:ba,bridge_name='br-int',has_traffic_filtering=True,id=ea81e5cb-74ba-43da-a780-3f1f699fa0d6,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea81e5cb-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.683 2 DEBUG os_vif [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:ac:ba,bridge_name='br-int',has_traffic_filtering=True,id=ea81e5cb-74ba-43da-a780-3f1f699fa0d6,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea81e5cb-74') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.685 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea81e5cb-74, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.695 2 INFO os_vif [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:ac:ba,bridge_name='br-int',has_traffic_filtering=True,id=ea81e5cb-74ba-43da-a780-3f1f699fa0d6,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea81e5cb-74')
Oct 08 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.696 2 DEBUG nova.virt.libvirt.guest [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:name>tempest-TestNetworkBasicOps-server-1641480242</nova:name>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:creationTime>2025-10-08 19:12:00</nova:creationTime>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:flavor name="m1.nano">
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:memory>128</nova:memory>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:disk>1</nova:disk>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:swap>0</nova:swap>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:vcpus>1</nova:vcpus>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </nova:flavor>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:owner>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </nova:owner>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   <nova:ports>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     <nova:port uuid="bfb32e9e-52b6-4043-b9a6-129d11fa2814">
Oct 08 19:12:00 compute-0 nova_compute[117514]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct 08 19:12:00 compute-0 nova_compute[117514]:     </nova:port>
Oct 08 19:12:00 compute-0 nova_compute[117514]:   </nova:ports>
Oct 08 19:12:00 compute-0 nova_compute[117514]: </nova:instance>
Oct 08 19:12:00 compute-0 nova_compute[117514]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Oct 08 19:12:00 compute-0 podman[148319]: 2025-10-08 19:12:00.719649533 +0000 UTC m=+0.129808978 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, architecture=x86_64, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9)
Oct 08 19:12:00 compute-0 podman[148320]: 2025-10-08 19:12:00.739531348 +0000 UTC m=+0.147768647 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 08 19:12:00 compute-0 neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602[147719]: [NOTICE]   (147733) : haproxy version is 2.8.14-c23fe91
Oct 08 19:12:00 compute-0 neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602[147719]: [NOTICE]   (147733) : path to executable is /usr/sbin/haproxy
Oct 08 19:12:00 compute-0 neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602[147719]: [WARNING]  (147733) : Exiting Master process...
Oct 08 19:12:00 compute-0 neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602[147719]: [ALERT]    (147733) : Current worker (147741) exited with code 143 (Terminated)
Oct 08 19:12:00 compute-0 neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602[147719]: [WARNING]  (147733) : All workers exited. Exiting... (0)
Oct 08 19:12:00 compute-0 systemd[1]: libpod-0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40.scope: Deactivated successfully.
Oct 08 19:12:00 compute-0 conmon[147719]: conmon 0263a6d21769e3c38d37 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40.scope/container/memory.events
Oct 08 19:12:00 compute-0 podman[148380]: 2025-10-08 19:12:00.945383306 +0000 UTC m=+0.152696930 container died 0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3)
Oct 08 19:12:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40-userdata-shm.mount: Deactivated successfully.
Oct 08 19:12:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-dbb48eabd75c06324d43161ee30ba9aa731d2058597c747932479a573242f471-merged.mount: Deactivated successfully.
Oct 08 19:12:01 compute-0 podman[148380]: 2025-10-08 19:12:01.110181054 +0000 UTC m=+0.317494648 container cleanup 0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:12:01 compute-0 systemd[1]: libpod-conmon-0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40.scope: Deactivated successfully.
Oct 08 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:01 compute-0 podman[148415]: 2025-10-08 19:12:01.174238368 +0000 UTC m=+0.045091556 container remove 0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 08 19:12:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:01.183 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[aae42e50-311f-48fa-a3c9-1c9e15485cb7]: (4, ('Wed Oct  8 07:12:00 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602 (0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40)\n0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40\nWed Oct  8 07:12:01 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602 (0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40)\n0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:01.185 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[12839b11-3b44-4157-b1e9-ec898d017753]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:01.186 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3bc67bd-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:01 compute-0 kernel: tape3bc67bd-d0: left promiscuous mode
Oct 08 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:01.193 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[56735f69-be01-4d74-92f1-1dfdf4e200e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:01.237 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[c809fb2b-e047-4e16-a3df-daab626a0cf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:01.239 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[92e5f1c3-8724-4f92-ae62-b9a4597d7ef2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:01.257 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e10dbd20-ba9f-4bf0-91f5-c24603f812ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 130089, 'reachable_time': 29829, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 148429, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:01.260 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 08 19:12:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:01.260 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[94f37eed-b18c-4c7b-89b1-649a842c59c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:01 compute-0 systemd[1]: run-netns-ovnmeta\x2de3bc67bd\x2ddd21\x2d4701\x2db445\x2d33eb52179602.mount: Deactivated successfully.
Oct 08 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.289 2 DEBUG nova.compute.manager [req-6a1e267f-3daf-4b78-a19b-302bd76702de req-e5448483-8e82-4711-abf2-a455b1ed91ca bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-unplugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.289 2 DEBUG oslo_concurrency.lockutils [req-6a1e267f-3daf-4b78-a19b-302bd76702de req-e5448483-8e82-4711-abf2-a455b1ed91ca bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.290 2 DEBUG oslo_concurrency.lockutils [req-6a1e267f-3daf-4b78-a19b-302bd76702de req-e5448483-8e82-4711-abf2-a455b1ed91ca bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.290 2 DEBUG oslo_concurrency.lockutils [req-6a1e267f-3daf-4b78-a19b-302bd76702de req-e5448483-8e82-4711-abf2-a455b1ed91ca bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.291 2 DEBUG nova.compute.manager [req-6a1e267f-3daf-4b78-a19b-302bd76702de req-e5448483-8e82-4711-abf2-a455b1ed91ca bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] No waiting events found dispatching network-vif-unplugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.291 2 WARNING nova.compute.manager [req-6a1e267f-3daf-4b78-a19b-302bd76702de req-e5448483-8e82-4711-abf2-a455b1ed91ca bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received unexpected event network-vif-unplugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 for instance with vm_state active and task_state None.
Oct 08 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.565 2 DEBUG oslo_concurrency.lockutils [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.566 2 DEBUG oslo_concurrency.lockutils [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.566 2 DEBUG nova.network.neutron [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 08 19:12:02 compute-0 ovn_controller[19759]: 2025-10-08T19:12:02Z|00105|binding|INFO|Releasing lport ef1b5170-2d11-4e01-98e4-310f59c22ecd from this chassis (sb_readonly=0)
Oct 08 19:12:02 compute-0 nova_compute[117514]: 2025-10-08 19:12:02.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.157 2 INFO nova.network.neutron [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Port ea81e5cb-74ba-43da-a780-3f1f699fa0d6 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.158 2 DEBUG nova.network.neutron [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.177 2 DEBUG oslo_concurrency.lockutils [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.201 2 DEBUG oslo_concurrency.lockutils [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "interface-783f8889-2bc8-4641-bdb9-95ee4226a2fd-ea81e5cb-74ba-43da-a780-3f1f699fa0d6" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.756s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.392 2 DEBUG nova.compute.manager [req-c9052887-e383-4b55-822a-7f514a42851a req-74fc0c1e-d429-4ca4-bc1e-d2c066760e93 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-plugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.393 2 DEBUG oslo_concurrency.lockutils [req-c9052887-e383-4b55-822a-7f514a42851a req-74fc0c1e-d429-4ca4-bc1e-d2c066760e93 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.393 2 DEBUG oslo_concurrency.lockutils [req-c9052887-e383-4b55-822a-7f514a42851a req-74fc0c1e-d429-4ca4-bc1e-d2c066760e93 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.394 2 DEBUG oslo_concurrency.lockutils [req-c9052887-e383-4b55-822a-7f514a42851a req-74fc0c1e-d429-4ca4-bc1e-d2c066760e93 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.394 2 DEBUG nova.compute.manager [req-c9052887-e383-4b55-822a-7f514a42851a req-74fc0c1e-d429-4ca4-bc1e-d2c066760e93 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] No waiting events found dispatching network-vif-plugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.395 2 WARNING nova.compute.manager [req-c9052887-e383-4b55-822a-7f514a42851a req-74fc0c1e-d429-4ca4-bc1e-d2c066760e93 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received unexpected event network-vif-plugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 for instance with vm_state active and task_state None.
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.395 2 DEBUG nova.compute.manager [req-c9052887-e383-4b55-822a-7f514a42851a req-74fc0c1e-d429-4ca4-bc1e-d2c066760e93 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-deleted-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.562 2 DEBUG nova.compute.manager [req-de0625b8-e44f-4bc1-a966-d6a52f2b1e9d req-c6626794-43f7-41b2-9e2f-64e413f363b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-changed-bfb32e9e-52b6-4043-b9a6-129d11fa2814 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.563 2 DEBUG nova.compute.manager [req-de0625b8-e44f-4bc1-a966-d6a52f2b1e9d req-c6626794-43f7-41b2-9e2f-64e413f363b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing instance network info cache due to event network-changed-bfb32e9e-52b6-4043-b9a6-129d11fa2814. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.563 2 DEBUG oslo_concurrency.lockutils [req-de0625b8-e44f-4bc1-a966-d6a52f2b1e9d req-c6626794-43f7-41b2-9e2f-64e413f363b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.563 2 DEBUG oslo_concurrency.lockutils [req-de0625b8-e44f-4bc1-a966-d6a52f2b1e9d req-c6626794-43f7-41b2-9e2f-64e413f363b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.563 2 DEBUG nova.network.neutron [req-de0625b8-e44f-4bc1-a966-d6a52f2b1e9d req-c6626794-43f7-41b2-9e2f-64e413f363b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing network info cache for port bfb32e9e-52b6-4043-b9a6-129d11fa2814 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.649 2 DEBUG oslo_concurrency.lockutils [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.650 2 DEBUG oslo_concurrency.lockutils [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.650 2 DEBUG oslo_concurrency.lockutils [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.651 2 DEBUG oslo_concurrency.lockutils [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.651 2 DEBUG oslo_concurrency.lockutils [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.653 2 INFO nova.compute.manager [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Terminating instance
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.655 2 DEBUG nova.compute.manager [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 08 19:12:03 compute-0 kernel: tapbfb32e9e-52 (unregistering): left promiscuous mode
Oct 08 19:12:03 compute-0 NetworkManager[1035]: <info>  [1759950723.6837] device (tapbfb32e9e-52): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:03 compute-0 ovn_controller[19759]: 2025-10-08T19:12:03Z|00106|binding|INFO|Releasing lport bfb32e9e-52b6-4043-b9a6-129d11fa2814 from this chassis (sb_readonly=0)
Oct 08 19:12:03 compute-0 ovn_controller[19759]: 2025-10-08T19:12:03Z|00107|binding|INFO|Setting lport bfb32e9e-52b6-4043-b9a6-129d11fa2814 down in Southbound
Oct 08 19:12:03 compute-0 ovn_controller[19759]: 2025-10-08T19:12:03Z|00108|binding|INFO|Removing iface tapbfb32e9e-52 ovn-installed in OVS
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:03.707 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:85:2e 10.100.0.14'], port_security=['fa:16:3e:4e:85:2e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0d073e98-c9f2-4b90-8237-84ff2fa99090', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e1f96720-345d-4fd7-8b5f-d68f6fe81454', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd3b59ed-5967-491c-a3b5-d0ba2b165b15, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=bfb32e9e-52b6-4043-b9a6-129d11fa2814) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:12:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:03.709 28643 INFO neutron.agent.ovn.metadata.agent [-] Port bfb32e9e-52b6-4043-b9a6-129d11fa2814 in datapath 0d073e98-c9f2-4b90-8237-84ff2fa99090 unbound from our chassis
Oct 08 19:12:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:03.711 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0d073e98-c9f2-4b90-8237-84ff2fa99090, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 08 19:12:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:03.712 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f4e61a3f-647f-4223-b0a0-be0eb8ece5b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:03.713 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090 namespace which is not needed anymore
Oct 08 19:12:03 compute-0 podman[148434]: 2025-10-08 19:12:03.722363078 +0000 UTC m=+0.128600853 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.741 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:12:03 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Deactivated successfully.
Oct 08 19:12:03 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Consumed 19.686s CPU time.
Oct 08 19:12:03 compute-0 systemd-machined[77568]: Machine qemu-6-instance-00000006 terminated.
Oct 08 19:12:03 compute-0 NetworkManager[1035]: <info>  [1759950723.8768] manager: (tapbfb32e9e-52): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.921 2 INFO nova.virt.libvirt.driver [-] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Instance destroyed successfully.
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.921 2 DEBUG nova.objects.instance [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 783f8889-2bc8-4641-bdb9-95ee4226a2fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.938 2 DEBUG nova.virt.libvirt.vif [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1641480242',display_name='tempest-TestNetworkBasicOps-server-1641480242',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1641480242',id=6,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPSyEE+QeB2DOtd7xoaY+J9mVl+DzPE43UDhso7eEGO9aQXs3wmj/YcqHfJ97lRUVFOa3dbwNiIUyunSI3DyzjQf/v6cjCZ2KkxRD0GJnQ0zRM5omnXaZRnz3Bq5VONa9g==',key_name='tempest-TestNetworkBasicOps-1535027603',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:09:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-pbt1zket',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:09:58Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=783f8889-2bc8-4641-bdb9-95ee4226a2fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.939 2 DEBUG nova.network.os_vif_util [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.939 2 DEBUG nova.network.os_vif_util [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4e:85:2e,bridge_name='br-int',has_traffic_filtering=True,id=bfb32e9e-52b6-4043-b9a6-129d11fa2814,network=Network(0d073e98-c9f2-4b90-8237-84ff2fa99090),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb32e9e-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.940 2 DEBUG os_vif [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:85:2e,bridge_name='br-int',has_traffic_filtering=True,id=bfb32e9e-52b6-4043-b9a6-129d11fa2814,network=Network(0d073e98-c9f2-4b90-8237-84ff2fa99090),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb32e9e-52') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.943 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbfb32e9e-52, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.951 2 INFO os_vif [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:85:2e,bridge_name='br-int',has_traffic_filtering=True,id=bfb32e9e-52b6-4043-b9a6-129d11fa2814,network=Network(0d073e98-c9f2-4b90-8237-84ff2fa99090),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb32e9e-52')
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.952 2 INFO nova.virt.libvirt.driver [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Deleting instance files /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd_del
Oct 08 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.953 2 INFO nova.virt.libvirt.driver [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Deletion of /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd_del complete
Oct 08 19:12:03 compute-0 neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090[147419]: [NOTICE]   (147423) : haproxy version is 2.8.14-c23fe91
Oct 08 19:12:03 compute-0 neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090[147419]: [NOTICE]   (147423) : path to executable is /usr/sbin/haproxy
Oct 08 19:12:03 compute-0 neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090[147419]: [WARNING]  (147423) : Exiting Master process...
Oct 08 19:12:03 compute-0 neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090[147419]: [WARNING]  (147423) : Exiting Master process...
Oct 08 19:12:03 compute-0 neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090[147419]: [ALERT]    (147423) : Current worker (147425) exited with code 143 (Terminated)
Oct 08 19:12:03 compute-0 neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090[147419]: [WARNING]  (147423) : All workers exited. Exiting... (0)
Oct 08 19:12:03 compute-0 systemd[1]: libpod-a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc.scope: Deactivated successfully.
Oct 08 19:12:03 compute-0 podman[148484]: 2025-10-08 19:12:03.972838737 +0000 UTC m=+0.117522042 container died a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.012 2 INFO nova.compute.manager [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Took 0.36 seconds to destroy the instance on the hypervisor.
Oct 08 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.013 2 DEBUG oslo.service.loopingcall [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 08 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.013 2 DEBUG nova.compute.manager [-] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 08 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.013 2 DEBUG nova.network.neutron [-] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 08 19:12:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc-userdata-shm.mount: Deactivated successfully.
Oct 08 19:12:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-3a1feba6530d5a46f34a3cb37ffae2c111c4760047000322a985a4db99d10005-merged.mount: Deactivated successfully.
Oct 08 19:12:04 compute-0 podman[148484]: 2025-10-08 19:12:04.057530407 +0000 UTC m=+0.202213722 container cleanup a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 08 19:12:04 compute-0 systemd[1]: libpod-conmon-a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc.scope: Deactivated successfully.
Oct 08 19:12:04 compute-0 podman[148529]: 2025-10-08 19:12:04.142922969 +0000 UTC m=+0.056110035 container remove a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:12:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:04.151 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[0705cef5-5347-423c-81a1-907173043981]: (4, ('Wed Oct  8 07:12:03 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090 (a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc)\na10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc\nWed Oct  8 07:12:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090 (a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc)\na10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:04.153 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f32e71f6-abf7-41a8-a869-1c2825262be4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:04.153 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d073e98-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:04 compute-0 kernel: tap0d073e98-c0: left promiscuous mode
Oct 08 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:04.184 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[02956841-c220-44fa-990d-24034cc7f54a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:04.210 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b9ac2b8d-b441-4549-b739-15143ce2b8ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:04.211 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f1e961ab-fd12-46e2-a24b-74511d41797c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:04.226 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[3e033ace-347b-4a76-bb6e-4c942769317e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 126810, 'reachable_time': 22549, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 148544, 'error': None, 'target': 'ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:04.228 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 08 19:12:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:04.228 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[ed4393d4-631e-4003-97ce-23b78cf3a537]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:04 compute-0 systemd[1]: run-netns-ovnmeta\x2d0d073e98\x2dc9f2\x2d4b90\x2d8237\x2d84ff2fa99090.mount: Deactivated successfully.
Oct 08 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.719 2 DEBUG nova.network.neutron [-] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.737 2 INFO nova.compute.manager [-] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Took 0.72 seconds to deallocate network for instance.
Oct 08 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.779 2 DEBUG oslo_concurrency.lockutils [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.779 2 DEBUG oslo_concurrency.lockutils [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.851 2 DEBUG nova.compute.provider_tree [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.868 2 DEBUG nova.scheduler.client.report [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.890 2 DEBUG oslo_concurrency.lockutils [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.917 2 INFO nova.scheduler.client.report [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 783f8889-2bc8-4641-bdb9-95ee4226a2fd
Oct 08 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.974 2 DEBUG oslo_concurrency.lockutils [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.076 2 DEBUG nova.network.neutron [req-de0625b8-e44f-4bc1-a966-d6a52f2b1e9d req-c6626794-43f7-41b2-9e2f-64e413f363b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updated VIF entry in instance network info cache for port bfb32e9e-52b6-4043-b9a6-129d11fa2814. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.076 2 DEBUG nova.network.neutron [req-de0625b8-e44f-4bc1-a966-d6a52f2b1e9d req-c6626794-43f7-41b2-9e2f-64e413f363b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.093 2 DEBUG oslo_concurrency.lockutils [req-de0625b8-e44f-4bc1-a966-d6a52f2b1e9d req-c6626794-43f7-41b2-9e2f-64e413f363b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.502 2 DEBUG nova.compute.manager [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-unplugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.503 2 DEBUG oslo_concurrency.lockutils [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.503 2 DEBUG oslo_concurrency.lockutils [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.503 2 DEBUG oslo_concurrency.lockutils [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.503 2 DEBUG nova.compute.manager [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] No waiting events found dispatching network-vif-unplugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.503 2 WARNING nova.compute.manager [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received unexpected event network-vif-unplugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 for instance with vm_state deleted and task_state None.
Oct 08 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.504 2 DEBUG nova.compute.manager [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-plugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.504 2 DEBUG oslo_concurrency.lockutils [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.504 2 DEBUG oslo_concurrency.lockutils [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.504 2 DEBUG oslo_concurrency.lockutils [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.504 2 DEBUG nova.compute.manager [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] No waiting events found dispatching network-vif-plugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.505 2 WARNING nova.compute.manager [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received unexpected event network-vif-plugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 for instance with vm_state deleted and task_state None.
Oct 08 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.758 2 DEBUG nova.compute.manager [req-e86a8bd8-fa0c-4a22-ac7c-911cbd94655a req-a99c0ac0-97a1-4fb1-8cb7-0079fc7a690d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-deleted-bfb32e9e-52b6-4043-b9a6-129d11fa2814 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:12:06 compute-0 nova_compute[117514]: 2025-10-08 19:12:06.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:06 compute-0 podman[148547]: 2025-10-08 19:12:06.689340659 +0000 UTC m=+0.089018837 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Oct 08 19:12:06 compute-0 podman[148545]: 2025-10-08 19:12:06.698988248 +0000 UTC m=+0.116542363 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid)
Oct 08 19:12:06 compute-0 podman[148546]: 2025-10-08 19:12:06.738472711 +0000 UTC m=+0.144543064 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 08 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.746 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.747 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.747 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.748 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.990 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.992 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6096MB free_disk=73.41395950317383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.992 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.993 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:08 compute-0 nova_compute[117514]: 2025-10-08 19:12:08.059 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:12:08 compute-0 nova_compute[117514]: 2025-10-08 19:12:08.060 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:12:08 compute-0 nova_compute[117514]: 2025-10-08 19:12:08.084 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:12:08 compute-0 nova_compute[117514]: 2025-10-08 19:12:08.101 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:12:08 compute-0 nova_compute[117514]: 2025-10-08 19:12:08.124 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:12:08 compute-0 nova_compute[117514]: 2025-10-08 19:12:08.124 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:12:08 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:08.437 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:12:08 compute-0 nova_compute[117514]: 2025-10-08 19:12:08.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:09 compute-0 nova_compute[117514]: 2025-10-08 19:12:09.121 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:12:09 compute-0 nova_compute[117514]: 2025-10-08 19:12:09.121 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:12:09 compute-0 nova_compute[117514]: 2025-10-08 19:12:09.122 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:12:09 compute-0 nova_compute[117514]: 2025-10-08 19:12:09.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:12:09 compute-0 nova_compute[117514]: 2025-10-08 19:12:09.718 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:12:09 compute-0 nova_compute[117514]: 2025-10-08 19:12:09.747 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 08 19:12:09 compute-0 nova_compute[117514]: 2025-10-08 19:12:09.748 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:12:09 compute-0 nova_compute[117514]: 2025-10-08 19:12:09.748 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:12:11 compute-0 nova_compute[117514]: 2025-10-08 19:12:11.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:12 compute-0 nova_compute[117514]: 2025-10-08 19:12:12.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:12:12 compute-0 nova_compute[117514]: 2025-10-08 19:12:12.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:12:13 compute-0 nova_compute[117514]: 2025-10-08 19:12:13.239 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950718.237962, 6af51230-93a7-45ef-9a1e-c47302f43bcf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:12:13 compute-0 nova_compute[117514]: 2025-10-08 19:12:13.240 2 INFO nova.compute.manager [-] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] VM Stopped (Lifecycle Event)
Oct 08 19:12:13 compute-0 nova_compute[117514]: 2025-10-08 19:12:13.266 2 DEBUG nova.compute.manager [None req-6015f091-6ebe-41ee-a8e3-9bae346d4c2f - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:12:13 compute-0 nova_compute[117514]: 2025-10-08 19:12:13.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:15 compute-0 podman[148613]: 2025-10-08 19:12:15.674344477 +0000 UTC m=+0.086946507 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 08 19:12:16 compute-0 nova_compute[117514]: 2025-10-08 19:12:16.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:18 compute-0 nova_compute[117514]: 2025-10-08 19:12:18.920 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950723.9176078, 783f8889-2bc8-4641-bdb9-95ee4226a2fd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:12:18 compute-0 nova_compute[117514]: 2025-10-08 19:12:18.921 2 INFO nova.compute.manager [-] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] VM Stopped (Lifecycle Event)
Oct 08 19:12:18 compute-0 nova_compute[117514]: 2025-10-08 19:12:18.947 2 DEBUG nova.compute.manager [None req-99b0ff47-cd86-4520-b735-4251f68d6515 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:12:18 compute-0 nova_compute[117514]: 2025-10-08 19:12:18.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:21 compute-0 nova_compute[117514]: 2025-10-08 19:12:21.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:23 compute-0 nova_compute[117514]: 2025-10-08 19:12:23.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:25 compute-0 podman[148637]: 2025-10-08 19:12:25.636637426 +0000 UTC m=+0.060350248 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, org.label-schema.vendor=CentOS)
Oct 08 19:12:26 compute-0 nova_compute[117514]: 2025-10-08 19:12:26.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.141 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "9064d639-63f3-422f-a67f-7a4dad8d2182" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.142 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.159 2 DEBUG nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.260 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.261 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.272 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.273 2 INFO nova.compute.claims [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Claim successful on node compute-0.ctlplane.example.com
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.377 2 DEBUG nova.compute.provider_tree [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.400 2 DEBUG nova.scheduler.client.report [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.427 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.428 2 DEBUG nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.517 2 DEBUG nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.518 2 DEBUG nova.network.neutron [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.535 2 INFO nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.555 2 DEBUG nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.632 2 DEBUG nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.634 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.634 2 INFO nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Creating image(s)
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.635 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.635 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.636 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.649 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.715 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.716 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.717 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.733 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.793 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.794 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.843 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.844 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.845 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.897 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.899 2 DEBUG nova.virt.disk.api [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.899 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.954 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.956 2 DEBUG nova.virt.disk.api [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.956 2 DEBUG nova.objects.instance [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 9064d639-63f3-422f-a67f-7a4dad8d2182 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.981 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.982 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Ensure instance console log exists: /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.982 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.983 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.983 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:29 compute-0 nova_compute[117514]: 2025-10-08 19:12:29.338 2 DEBUG nova.policy [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 08 19:12:31 compute-0 nova_compute[117514]: 2025-10-08 19:12:31.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:31 compute-0 nova_compute[117514]: 2025-10-08 19:12:31.321 2 DEBUG nova.network.neutron [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Successfully updated port: 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 08 19:12:31 compute-0 nova_compute[117514]: 2025-10-08 19:12:31.336 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-9064d639-63f3-422f-a67f-7a4dad8d2182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:12:31 compute-0 nova_compute[117514]: 2025-10-08 19:12:31.336 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-9064d639-63f3-422f-a67f-7a4dad8d2182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:12:31 compute-0 nova_compute[117514]: 2025-10-08 19:12:31.337 2 DEBUG nova.network.neutron [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 08 19:12:31 compute-0 nova_compute[117514]: 2025-10-08 19:12:31.404 2 DEBUG nova.compute.manager [req-cb2ce92c-f7b2-40ba-a5c9-6eed785f55e5 req-1705a742-acbb-4a77-8261-c18cc1ae9158 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Received event network-changed-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:12:31 compute-0 nova_compute[117514]: 2025-10-08 19:12:31.405 2 DEBUG nova.compute.manager [req-cb2ce92c-f7b2-40ba-a5c9-6eed785f55e5 req-1705a742-acbb-4a77-8261-c18cc1ae9158 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Refreshing instance network info cache due to event network-changed-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:12:31 compute-0 nova_compute[117514]: 2025-10-08 19:12:31.405 2 DEBUG oslo_concurrency.lockutils [req-cb2ce92c-f7b2-40ba-a5c9-6eed785f55e5 req-1705a742-acbb-4a77-8261-c18cc1ae9158 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-9064d639-63f3-422f-a67f-7a4dad8d2182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:12:31 compute-0 nova_compute[117514]: 2025-10-08 19:12:31.466 2 DEBUG nova.network.neutron [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 08 19:12:31 compute-0 podman[148673]: 2025-10-08 19:12:31.683053862 +0000 UTC m=+0.094780874 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 08 19:12:31 compute-0 podman[148672]: 2025-10-08 19:12:31.688879641 +0000 UTC m=+0.099758018 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible, version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.614 2 DEBUG nova.network.neutron [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Updating instance_info_cache with network_info: [{"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.642 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-9064d639-63f3-422f-a67f-7a4dad8d2182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.642 2 DEBUG nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Instance network_info: |[{"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.643 2 DEBUG oslo_concurrency.lockutils [req-cb2ce92c-f7b2-40ba-a5c9-6eed785f55e5 req-1705a742-acbb-4a77-8261-c18cc1ae9158 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-9064d639-63f3-422f-a67f-7a4dad8d2182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.643 2 DEBUG nova.network.neutron [req-cb2ce92c-f7b2-40ba-a5c9-6eed785f55e5 req-1705a742-acbb-4a77-8261-c18cc1ae9158 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Refreshing network info cache for port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.646 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Start _get_guest_xml network_info=[{"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.652 2 WARNING nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.659 2 DEBUG nova.virt.libvirt.host [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.660 2 DEBUG nova.virt.libvirt.host [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.663 2 DEBUG nova.virt.libvirt.host [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.664 2 DEBUG nova.virt.libvirt.host [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.664 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.664 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.665 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.665 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.665 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.666 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.666 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.666 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.666 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.667 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.667 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.667 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.671 2 DEBUG nova.virt.libvirt.vif [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:12:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2063989498',display_name='tempest-TestNetworkBasicOps-server-2063989498',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2063989498',id=8,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAHrmLiAZgqvVf5RfnV+fV+6NQU3NOHOaBSGPHYj+myBWF2AbxEIt6spK8FUlXi8r+736xE5lbIw3NTujAKkT/2AVTAI40/9ASURZfXUfcM5xxB2Et9shqsazA/r0h6yOw==',key_name='tempest-TestNetworkBasicOps-2016672130',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-v8tc0ebw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:12:28Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=9064d639-63f3-422f-a67f-7a4dad8d2182,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.671 2 DEBUG nova.network.os_vif_util [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.672 2 DEBUG nova.network.os_vif_util [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.673 2 DEBUG nova.objects.instance [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9064d639-63f3-422f-a67f-7a4dad8d2182 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.687 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] End _get_guest_xml xml=<domain type="kvm">
Oct 08 19:12:32 compute-0 nova_compute[117514]:   <uuid>9064d639-63f3-422f-a67f-7a4dad8d2182</uuid>
Oct 08 19:12:32 compute-0 nova_compute[117514]:   <name>instance-00000008</name>
Oct 08 19:12:32 compute-0 nova_compute[117514]:   <memory>131072</memory>
Oct 08 19:12:32 compute-0 nova_compute[117514]:   <vcpu>1</vcpu>
Oct 08 19:12:32 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <nova:name>tempest-TestNetworkBasicOps-server-2063989498</nova:name>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <nova:creationTime>2025-10-08 19:12:32</nova:creationTime>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <nova:flavor name="m1.nano">
Oct 08 19:12:32 compute-0 nova_compute[117514]:         <nova:memory>128</nova:memory>
Oct 08 19:12:32 compute-0 nova_compute[117514]:         <nova:disk>1</nova:disk>
Oct 08 19:12:32 compute-0 nova_compute[117514]:         <nova:swap>0</nova:swap>
Oct 08 19:12:32 compute-0 nova_compute[117514]:         <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:12:32 compute-0 nova_compute[117514]:         <nova:vcpus>1</nova:vcpus>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       </nova:flavor>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <nova:owner>
Oct 08 19:12:32 compute-0 nova_compute[117514]:         <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:12:32 compute-0 nova_compute[117514]:         <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       </nova:owner>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <nova:ports>
Oct 08 19:12:32 compute-0 nova_compute[117514]:         <nova:port uuid="8b1cf032-8f00-4a3b-a370-211b5b0ca4ce">
Oct 08 19:12:32 compute-0 nova_compute[117514]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:         </nova:port>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       </nova:ports>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     </nova:instance>
Oct 08 19:12:32 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:12:32 compute-0 nova_compute[117514]:   <sysinfo type="smbios">
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <system>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <entry name="manufacturer">RDO</entry>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <entry name="product">OpenStack Compute</entry>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <entry name="serial">9064d639-63f3-422f-a67f-7a4dad8d2182</entry>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <entry name="uuid">9064d639-63f3-422f-a67f-7a4dad8d2182</entry>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <entry name="family">Virtual Machine</entry>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     </system>
Oct 08 19:12:32 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:12:32 compute-0 nova_compute[117514]:   <os>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <boot dev="hd"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <smbios mode="sysinfo"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:   </os>
Oct 08 19:12:32 compute-0 nova_compute[117514]:   <features>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <vmcoreinfo/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:   </features>
Oct 08 19:12:32 compute-0 nova_compute[117514]:   <clock offset="utc">
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <timer name="pit" tickpolicy="delay"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <timer name="hpet" present="no"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:12:32 compute-0 nova_compute[117514]:   <cpu mode="host-model" match="exact">
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <topology sockets="1" cores="1" threads="1"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:12:32 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <disk type="file" device="disk">
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <target dev="vda" bus="virtio"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <disk type="file" device="cdrom">
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <driver name="qemu" type="raw" cache="none"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk.config"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <target dev="sda" bus="sata"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <interface type="ethernet">
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <mac address="fa:16:3e:78:c5:0e"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <mtu size="1442"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <target dev="tap8b1cf032-8f"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <serial type="pty">
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <log file="/var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/console.log" append="off"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <video>
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     </video>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <input type="tablet" bus="usb"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <rng model="virtio">
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <backend model="random">/dev/urandom</backend>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <controller type="usb" index="0"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     <memballoon model="virtio">
Oct 08 19:12:32 compute-0 nova_compute[117514]:       <stats period="10"/>
Oct 08 19:12:32 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:12:32 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:12:32 compute-0 nova_compute[117514]: </domain>
Oct 08 19:12:32 compute-0 nova_compute[117514]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.688 2 DEBUG nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Preparing to wait for external event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.689 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.689 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.689 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.690 2 DEBUG nova.virt.libvirt.vif [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:12:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2063989498',display_name='tempest-TestNetworkBasicOps-server-2063989498',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2063989498',id=8,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAHrmLiAZgqvVf5RfnV+fV+6NQU3NOHOaBSGPHYj+myBWF2AbxEIt6spK8FUlXi8r+736xE5lbIw3NTujAKkT/2AVTAI40/9ASURZfXUfcM5xxB2Et9shqsazA/r0h6yOw==',key_name='tempest-TestNetworkBasicOps-2016672130',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-v8tc0ebw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:12:28Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=9064d639-63f3-422f-a67f-7a4dad8d2182,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.690 2 DEBUG nova.network.os_vif_util [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.691 2 DEBUG nova.network.os_vif_util [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.691 2 DEBUG os_vif [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.692 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.693 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.695 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8b1cf032-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.695 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8b1cf032-8f, col_values=(('external_ids', {'iface-id': '8b1cf032-8f00-4a3b-a370-211b5b0ca4ce', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:78:c5:0e', 'vm-uuid': '9064d639-63f3-422f-a67f-7a4dad8d2182'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:32 compute-0 NetworkManager[1035]: <info>  [1759950752.6979] manager: (tap8b1cf032-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.709 2 INFO os_vif [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f')
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.782 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.782 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.783 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:78:c5:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 08 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.783 2 INFO nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Using config drive
Oct 08 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.300 2 INFO nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Creating config drive at /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk.config
Oct 08 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.309 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmqm54ocd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.445 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmqm54ocd" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:12:33 compute-0 kernel: tap8b1cf032-8f: entered promiscuous mode
Oct 08 19:12:33 compute-0 NetworkManager[1035]: <info>  [1759950753.5090] manager: (tap8b1cf032-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/65)
Oct 08 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:33 compute-0 ovn_controller[19759]: 2025-10-08T19:12:33Z|00109|binding|INFO|Claiming lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce for this chassis.
Oct 08 19:12:33 compute-0 ovn_controller[19759]: 2025-10-08T19:12:33Z|00110|binding|INFO|8b1cf032-8f00-4a3b-a370-211b5b0ca4ce: Claiming fa:16:3e:78:c5:0e 10.100.0.4
Oct 08 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.532 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:c5:0e 10.100.0.4'], port_security=['fa:16:3e:78:c5:0e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-94131643', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9064d639-63f3-422f-a67f-7a4dad8d2182', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-94131643', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'be57f10c-6afc-483d-a1fa-fab953b8fe3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d3717a1-7175-40f6-8720-19b5f1d50c4f, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.533 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce in datapath 9bec1de5-8be3-4df6-b90a-943d76fedc48 bound to our chassis
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.534 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9bec1de5-8be3-4df6-b90a-943d76fedc48
Oct 08 19:12:33 compute-0 systemd-udevd[148732]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.549 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[aeb45690-ef70-41e0-b164-bbb5d3fcee3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.550 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9bec1de5-81 in ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 08 19:12:33 compute-0 systemd-machined[77568]: New machine qemu-8-instance-00000008.
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.559 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9bec1de5-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.559 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[63d1c861-d141-4e44-852b-3bab02309076]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.560 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[008dd50a-9519-40e7-90ad-6622bc285207]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:33 compute-0 NetworkManager[1035]: <info>  [1759950753.5684] device (tap8b1cf032-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 08 19:12:33 compute-0 NetworkManager[1035]: <info>  [1759950753.5693] device (tap8b1cf032-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 08 19:12:33 compute-0 ovn_controller[19759]: 2025-10-08T19:12:33Z|00111|binding|INFO|Setting lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce ovn-installed in OVS
Oct 08 19:12:33 compute-0 ovn_controller[19759]: 2025-10-08T19:12:33Z|00112|binding|INFO|Setting lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce up in Southbound
Oct 08 19:12:33 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000008.
Oct 08 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.576 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[e7e6ae86-d9fc-4a23-9146-06b03f2bf117]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.594 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ef14f539-b71d-4e95-b0a4-e5cd732e1311]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.630 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[c0bdc9b1-5747-4cf2-ae9c-b7cbc8e658eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.638 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[dacd09f6-344b-44c9-b220-d876cedff3d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:33 compute-0 NetworkManager[1035]: <info>  [1759950753.6392] manager: (tap9bec1de5-80): new Veth device (/org/freedesktop/NetworkManager/Devices/66)
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.679 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[e1b2cfef-e96d-4400-892d-927616bd7a4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.684 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[28b28539-a78a-4cb4-a0f6-400f6d2e35ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:33 compute-0 NetworkManager[1035]: <info>  [1759950753.7151] device (tap9bec1de5-80): carrier: link connected
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.722 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[fe335248-3986-43e5-a4da-00acbea1de26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.746 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[9579fe01-10ae-4f6d-a543-4122cfae2b16]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9bec1de5-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e5:ad:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 142429, 'reachable_time': 23969, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 148766, 'error': None, 'target': 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.766 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[7cd07b92-8e2d-450b-95f2-08ed663d3969]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee5:ad60'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 142429, 'tstamp': 142429}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 148767, 'error': None, 'target': 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.789 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[52677218-a51f-42dc-81ae-dac36394f6a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9bec1de5-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e5:ad:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 142429, 'reachable_time': 23969, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 148768, 'error': None, 'target': 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.796 2 DEBUG nova.compute.manager [req-0ce651f1-aed1-4de0-9c42-7bc1cb315a1e req-56478a7b-2765-4d72-841d-79e1ef4148ac bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Received event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.796 2 DEBUG oslo_concurrency.lockutils [req-0ce651f1-aed1-4de0-9c42-7bc1cb315a1e req-56478a7b-2765-4d72-841d-79e1ef4148ac bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.797 2 DEBUG oslo_concurrency.lockutils [req-0ce651f1-aed1-4de0-9c42-7bc1cb315a1e req-56478a7b-2765-4d72-841d-79e1ef4148ac bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.797 2 DEBUG oslo_concurrency.lockutils [req-0ce651f1-aed1-4de0-9c42-7bc1cb315a1e req-56478a7b-2765-4d72-841d-79e1ef4148ac bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.797 2 DEBUG nova.compute.manager [req-0ce651f1-aed1-4de0-9c42-7bc1cb315a1e req-56478a7b-2765-4d72-841d-79e1ef4148ac bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Processing event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.841 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[12caad1a-e216-4e4f-ad0a-25bfdf2e3331]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.931 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[6c9e2602-907e-4673-bb62-a2ec470764ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.933 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bec1de5-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.933 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.934 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9bec1de5-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:12:33 compute-0 kernel: tap9bec1de5-80: entered promiscuous mode
Oct 08 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:33 compute-0 NetworkManager[1035]: <info>  [1759950753.9367] manager: (tap9bec1de5-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.942 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9bec1de5-80, col_values=(('external_ids', {'iface-id': 'b57e5c57-68fb-43de-8f87-98a853dc8be7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:12:33 compute-0 ovn_controller[19759]: 2025-10-08T19:12:33Z|00113|binding|INFO|Releasing lport b57e5c57-68fb-43de-8f87-98a853dc8be7 from this chassis (sb_readonly=0)
Oct 08 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.944 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9bec1de5-8be3-4df6-b90a-943d76fedc48.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9bec1de5-8be3-4df6-b90a-943d76fedc48.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.945 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[61394c4e-c729-4daf-8f54-2c1c4d8d8f39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.946 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: global
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     log         /dev/log local0 debug
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     log-tag     haproxy-metadata-proxy-9bec1de5-8be3-4df6-b90a-943d76fedc48
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     user        root
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     group       root
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     maxconn     1024
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     pidfile     /var/lib/neutron/external/pids/9bec1de5-8be3-4df6-b90a-943d76fedc48.pid.haproxy
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     daemon
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: defaults
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     log global
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     mode http
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     option httplog
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     option dontlognull
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     option http-server-close
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     option forwardfor
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     retries                 3
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     timeout http-request    30s
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     timeout connect         30s
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     timeout client          32s
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     timeout server          32s
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     timeout http-keep-alive 30s
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: listen listener
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     bind 169.254.169.254:80
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     server metadata /var/lib/neutron/metadata_proxy
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:     http-request add-header X-OVN-Network-ID 9bec1de5-8be3-4df6-b90a-943d76fedc48
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 08 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.947 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'env', 'PROCESS_TAG=haproxy-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9bec1de5-8be3-4df6-b90a-943d76fedc48.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 08 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.282 2 DEBUG nova.network.neutron [req-cb2ce92c-f7b2-40ba-a5c9-6eed785f55e5 req-1705a742-acbb-4a77-8261-c18cc1ae9158 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Updated VIF entry in instance network info cache for port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.282 2 DEBUG nova.network.neutron [req-cb2ce92c-f7b2-40ba-a5c9-6eed785f55e5 req-1705a742-acbb-4a77-8261-c18cc1ae9158 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Updating instance_info_cache with network_info: [{"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:12:34 compute-0 podman[148800]: 2025-10-08 19:12:34.292727842 +0000 UTC m=+0.056915489 container create ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.296 2 DEBUG oslo_concurrency.lockutils [req-cb2ce92c-f7b2-40ba-a5c9-6eed785f55e5 req-1705a742-acbb-4a77-8261-c18cc1ae9158 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-9064d639-63f3-422f-a67f-7a4dad8d2182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:12:34 compute-0 systemd[1]: Started libpod-conmon-ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987.scope.
Oct 08 19:12:34 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:12:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fbecf7fadeecdbbbc2fa0c43982a21d5c67a25c8cf877db7615c2f2cd48a1b6c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 08 19:12:34 compute-0 podman[148800]: 2025-10-08 19:12:34.265274317 +0000 UTC m=+0.029462014 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 08 19:12:34 compute-0 podman[148800]: 2025-10-08 19:12:34.371208713 +0000 UTC m=+0.135396400 container init ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:12:34 compute-0 podman[148800]: 2025-10-08 19:12:34.37802458 +0000 UTC m=+0.142212277 container start ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:12:34 compute-0 podman[148813]: 2025-10-08 19:12:34.396077012 +0000 UTC m=+0.059239965 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 08 19:12:34 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[148818]: [NOTICE]   (148842) : New worker (148852) forked
Oct 08 19:12:34 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[148818]: [NOTICE]   (148842) : Loading success.
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.844 2 DEBUG nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.846 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950754.8441575, 9064d639-63f3-422f-a67f-7a4dad8d2182 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.847 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] VM Started (Lifecycle Event)
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.854 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.858 2 INFO nova.virt.libvirt.driver [-] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Instance spawned successfully.
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.859 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.870 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.874 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.885 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.885 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.886 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.886 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.887 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.887 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.895 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.895 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950754.844457, 9064d639-63f3-422f-a67f-7a4dad8d2182 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.896 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] VM Paused (Lifecycle Event)
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.945 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.950 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950754.8507743, 9064d639-63f3-422f-a67f-7a4dad8d2182 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.950 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] VM Resumed (Lifecycle Event)
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.978 2 INFO nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Took 6.34 seconds to spawn the instance on the hypervisor.
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.978 2 DEBUG nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.980 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.987 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:12:35 compute-0 nova_compute[117514]: 2025-10-08 19:12:35.035 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:12:35 compute-0 nova_compute[117514]: 2025-10-08 19:12:35.062 2 INFO nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Took 6.84 seconds to build instance.
Oct 08 19:12:35 compute-0 nova_compute[117514]: 2025-10-08 19:12:35.079 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.937s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:35 compute-0 nova_compute[117514]: 2025-10-08 19:12:35.891 2 DEBUG nova.compute.manager [req-ca939b36-bf91-4426-b3ea-e6a993bfc236 req-70b82259-141b-487a-947f-76dde449f332 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Received event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:12:35 compute-0 nova_compute[117514]: 2025-10-08 19:12:35.892 2 DEBUG oslo_concurrency.lockutils [req-ca939b36-bf91-4426-b3ea-e6a993bfc236 req-70b82259-141b-487a-947f-76dde449f332 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:35 compute-0 nova_compute[117514]: 2025-10-08 19:12:35.892 2 DEBUG oslo_concurrency.lockutils [req-ca939b36-bf91-4426-b3ea-e6a993bfc236 req-70b82259-141b-487a-947f-76dde449f332 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:35 compute-0 nova_compute[117514]: 2025-10-08 19:12:35.892 2 DEBUG oslo_concurrency.lockutils [req-ca939b36-bf91-4426-b3ea-e6a993bfc236 req-70b82259-141b-487a-947f-76dde449f332 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:35 compute-0 nova_compute[117514]: 2025-10-08 19:12:35.893 2 DEBUG nova.compute.manager [req-ca939b36-bf91-4426-b3ea-e6a993bfc236 req-70b82259-141b-487a-947f-76dde449f332 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] No waiting events found dispatching network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:12:35 compute-0 nova_compute[117514]: 2025-10-08 19:12:35.893 2 WARNING nova.compute.manager [req-ca939b36-bf91-4426-b3ea-e6a993bfc236 req-70b82259-141b-487a-947f-76dde449f332 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Received unexpected event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce for instance with vm_state active and task_state None.
Oct 08 19:12:36 compute-0 nova_compute[117514]: 2025-10-08 19:12:36.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:37 compute-0 podman[148862]: 2025-10-08 19:12:37.647807593 +0000 UTC m=+0.063178739 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:12:37 compute-0 podman[148863]: 2025-10-08 19:12:37.6715654 +0000 UTC m=+0.085231017 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 08 19:12:37 compute-0 podman[148864]: 2025-10-08 19:12:37.680287763 +0000 UTC m=+0.084911728 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:12:37 compute-0 nova_compute[117514]: 2025-10-08 19:12:37.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:38 compute-0 ovn_controller[19759]: 2025-10-08T19:12:38Z|00114|binding|INFO|Releasing lport b57e5c57-68fb-43de-8f87-98a853dc8be7 from this chassis (sb_readonly=0)
Oct 08 19:12:38 compute-0 nova_compute[117514]: 2025-10-08 19:12:38.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:38 compute-0 NetworkManager[1035]: <info>  [1759950758.4938] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Oct 08 19:12:38 compute-0 NetworkManager[1035]: <info>  [1759950758.4950] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Oct 08 19:12:38 compute-0 ovn_controller[19759]: 2025-10-08T19:12:38Z|00115|binding|INFO|Releasing lport b57e5c57-68fb-43de-8f87-98a853dc8be7 from this chassis (sb_readonly=0)
Oct 08 19:12:38 compute-0 nova_compute[117514]: 2025-10-08 19:12:38.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:38 compute-0 nova_compute[117514]: 2025-10-08 19:12:38.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:38 compute-0 nova_compute[117514]: 2025-10-08 19:12:38.766 2 DEBUG nova.compute.manager [req-e555e061-1370-4fc8-8d40-990836091950 req-af444cc9-8c48-4315-b507-317241b7a6da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Received event network-changed-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:12:38 compute-0 nova_compute[117514]: 2025-10-08 19:12:38.767 2 DEBUG nova.compute.manager [req-e555e061-1370-4fc8-8d40-990836091950 req-af444cc9-8c48-4315-b507-317241b7a6da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Refreshing instance network info cache due to event network-changed-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:12:38 compute-0 nova_compute[117514]: 2025-10-08 19:12:38.768 2 DEBUG oslo_concurrency.lockutils [req-e555e061-1370-4fc8-8d40-990836091950 req-af444cc9-8c48-4315-b507-317241b7a6da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-9064d639-63f3-422f-a67f-7a4dad8d2182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:12:38 compute-0 nova_compute[117514]: 2025-10-08 19:12:38.768 2 DEBUG oslo_concurrency.lockutils [req-e555e061-1370-4fc8-8d40-990836091950 req-af444cc9-8c48-4315-b507-317241b7a6da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-9064d639-63f3-422f-a67f-7a4dad8d2182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:12:38 compute-0 nova_compute[117514]: 2025-10-08 19:12:38.768 2 DEBUG nova.network.neutron [req-e555e061-1370-4fc8-8d40-990836091950 req-af444cc9-8c48-4315-b507-317241b7a6da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Refreshing network info cache for port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.300 2 DEBUG oslo_concurrency.lockutils [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "9064d639-63f3-422f-a67f-7a4dad8d2182" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.301 2 DEBUG oslo_concurrency.lockutils [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.302 2 DEBUG oslo_concurrency.lockutils [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.302 2 DEBUG oslo_concurrency.lockutils [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.302 2 DEBUG oslo_concurrency.lockutils [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.304 2 INFO nova.compute.manager [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Terminating instance
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.305 2 DEBUG nova.compute.manager [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 08 19:12:39 compute-0 kernel: tap8b1cf032-8f (unregistering): left promiscuous mode
Oct 08 19:12:39 compute-0 NetworkManager[1035]: <info>  [1759950759.3295] device (tap8b1cf032-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:39 compute-0 ovn_controller[19759]: 2025-10-08T19:12:39Z|00116|binding|INFO|Releasing lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce from this chassis (sb_readonly=0)
Oct 08 19:12:39 compute-0 ovn_controller[19759]: 2025-10-08T19:12:39Z|00117|binding|INFO|Setting lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce down in Southbound
Oct 08 19:12:39 compute-0 ovn_controller[19759]: 2025-10-08T19:12:39Z|00118|binding|INFO|Removing iface tap8b1cf032-8f ovn-installed in OVS
Oct 08 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.393 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:c5:0e 10.100.0.4'], port_security=['fa:16:3e:78:c5:0e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-94131643', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9064d639-63f3-422f-a67f-7a4dad8d2182', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-94131643', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'be57f10c-6afc-483d-a1fa-fab953b8fe3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d3717a1-7175-40f6-8720-19b5f1d50c4f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.394 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce in datapath 9bec1de5-8be3-4df6-b90a-943d76fedc48 unbound from our chassis
Oct 08 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.395 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9bec1de5-8be3-4df6-b90a-943d76fedc48, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 08 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.396 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[0d8a0b3e-8156-456f-b15f-930cc05fda6f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.396 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 namespace which is not needed anymore
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:39 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Deactivated successfully.
Oct 08 19:12:39 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Consumed 5.730s CPU time.
Oct 08 19:12:39 compute-0 systemd-machined[77568]: Machine qemu-8-instance-00000008 terminated.
Oct 08 19:12:39 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[148818]: [NOTICE]   (148842) : haproxy version is 2.8.14-c23fe91
Oct 08 19:12:39 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[148818]: [NOTICE]   (148842) : path to executable is /usr/sbin/haproxy
Oct 08 19:12:39 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[148818]: [WARNING]  (148842) : Exiting Master process...
Oct 08 19:12:39 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[148818]: [ALERT]    (148842) : Current worker (148852) exited with code 143 (Terminated)
Oct 08 19:12:39 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[148818]: [WARNING]  (148842) : All workers exited. Exiting... (0)
Oct 08 19:12:39 compute-0 systemd[1]: libpod-ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987.scope: Deactivated successfully.
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:39 compute-0 podman[148949]: 2025-10-08 19:12:39.539423055 +0000 UTC m=+0.052623744 container died ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:12:39 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987-userdata-shm.mount: Deactivated successfully.
Oct 08 19:12:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-fbecf7fadeecdbbbc2fa0c43982a21d5c67a25c8cf877db7615c2f2cd48a1b6c-merged.mount: Deactivated successfully.
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.570 2 INFO nova.virt.libvirt.driver [-] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Instance destroyed successfully.
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.571 2 DEBUG nova.objects.instance [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 9064d639-63f3-422f-a67f-7a4dad8d2182 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.586 2 DEBUG nova.virt.libvirt.vif [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:12:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2063989498',display_name='tempest-TestNetworkBasicOps-server-2063989498',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2063989498',id=8,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAHrmLiAZgqvVf5RfnV+fV+6NQU3NOHOaBSGPHYj+myBWF2AbxEIt6spK8FUlXi8r+736xE5lbIw3NTujAKkT/2AVTAI40/9ASURZfXUfcM5xxB2Et9shqsazA/r0h6yOw==',key_name='tempest-TestNetworkBasicOps-2016672130',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:12:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-v8tc0ebw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:12:35Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=9064d639-63f3-422f-a67f-7a4dad8d2182,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.586 2 DEBUG nova.network.os_vif_util [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.587 2 DEBUG nova.network.os_vif_util [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:12:39 compute-0 podman[148949]: 2025-10-08 19:12:39.588075793 +0000 UTC m=+0.101276482 container cleanup ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.588 2 DEBUG os_vif [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.591 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b1cf032-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:39 compute-0 systemd[1]: libpod-conmon-ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987.scope: Deactivated successfully.
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.600 2 INFO os_vif [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f')
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.601 2 INFO nova.virt.libvirt.driver [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Deleting instance files /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182_del
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.601 2 INFO nova.virt.libvirt.driver [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Deletion of /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182_del complete
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.644 2 INFO nova.compute.manager [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Took 0.34 seconds to destroy the instance on the hypervisor.
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.644 2 DEBUG oslo.service.loopingcall [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.645 2 DEBUG nova.compute.manager [-] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.645 2 DEBUG nova.network.neutron [-] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 08 19:12:39 compute-0 podman[148995]: 2025-10-08 19:12:39.661173768 +0000 UTC m=+0.049059620 container remove ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 08 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.666 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ce18e400-42f9-4b06-9ba0-13c0b339c46b]: (4, ('Wed Oct  8 07:12:39 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 (ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987)\nba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987\nWed Oct  8 07:12:39 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 (ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987)\nba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.668 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[9422a892-d616-4c92-8bec-fdc24d645473]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.669 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bec1de5-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:39 compute-0 kernel: tap9bec1de5-80: left promiscuous mode
Oct 08 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.687 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[879ca788-7422-4c25-be1d-8c78205272e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.714 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[5ba0e8d3-3dca-4473-9a50-8e2b9a8ee08e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.716 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[be756035-e6cb-48c5-adff-fa6948d2ad0e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.733 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e0e9d498-e4f9-4d4d-a8a9-cd8d47add5b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 142420, 'reachable_time': 41214, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 149008, 'error': None, 'target': 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.736 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 08 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.736 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[46ab0e75-87fa-46d8-9aee-b7d3674d2e04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:12:39 compute-0 systemd[1]: run-netns-ovnmeta\x2d9bec1de5\x2d8be3\x2d4df6\x2db90a\x2d943d76fedc48.mount: Deactivated successfully.
Oct 08 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.516 2 DEBUG nova.network.neutron [req-e555e061-1370-4fc8-8d40-990836091950 req-af444cc9-8c48-4315-b507-317241b7a6da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Updated VIF entry in instance network info cache for port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.517 2 DEBUG nova.network.neutron [req-e555e061-1370-4fc8-8d40-990836091950 req-af444cc9-8c48-4315-b507-317241b7a6da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Updating instance_info_cache with network_info: [{"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.535 2 DEBUG oslo_concurrency.lockutils [req-e555e061-1370-4fc8-8d40-990836091950 req-af444cc9-8c48-4315-b507-317241b7a6da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-9064d639-63f3-422f-a67f-7a4dad8d2182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.863 2 DEBUG nova.compute.manager [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Received event network-vif-unplugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.864 2 DEBUG oslo_concurrency.lockutils [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.864 2 DEBUG oslo_concurrency.lockutils [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.864 2 DEBUG oslo_concurrency.lockutils [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.864 2 DEBUG nova.compute.manager [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] No waiting events found dispatching network-vif-unplugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.865 2 DEBUG nova.compute.manager [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Received event network-vif-unplugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 08 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.865 2 DEBUG nova.compute.manager [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Received event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.865 2 DEBUG oslo_concurrency.lockutils [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.865 2 DEBUG oslo_concurrency.lockutils [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.866 2 DEBUG oslo_concurrency.lockutils [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.866 2 DEBUG nova.compute.manager [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] No waiting events found dispatching network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.866 2 WARNING nova.compute.manager [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Received unexpected event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce for instance with vm_state active and task_state deleting.
Oct 08 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.242 2 DEBUG nova.network.neutron [-] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.263 2 INFO nova.compute.manager [-] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Took 1.62 seconds to deallocate network for instance.
Oct 08 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.307 2 DEBUG oslo_concurrency.lockutils [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.308 2 DEBUG oslo_concurrency.lockutils [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.379 2 DEBUG nova.compute.provider_tree [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.393 2 DEBUG nova.scheduler.client.report [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.412 2 DEBUG oslo_concurrency.lockutils [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.436 2 INFO nova.scheduler.client.report [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 9064d639-63f3-422f-a67f-7a4dad8d2182
Oct 08 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.498 2 DEBUG oslo_concurrency.lockutils [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:44.233 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:44.234 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:44.234 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:44 compute-0 nova_compute[117514]: 2025-10-08 19:12:44.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:46 compute-0 nova_compute[117514]: 2025-10-08 19:12:46.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:46 compute-0 podman[149010]: 2025-10-08 19:12:46.642692007 +0000 UTC m=+0.063698585 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 08 19:12:49 compute-0 nova_compute[117514]: 2025-10-08 19:12:49.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:51 compute-0 nova_compute[117514]: 2025-10-08 19:12:51.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:53 compute-0 nova_compute[117514]: 2025-10-08 19:12:53.848 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "093d721c-61cb-4fd3-b678-7465d8840cc6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:53 compute-0 nova_compute[117514]: 2025-10-08 19:12:53.848 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:53 compute-0 nova_compute[117514]: 2025-10-08 19:12:53.870 2 DEBUG nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 08 19:12:53 compute-0 nova_compute[117514]: 2025-10-08 19:12:53.943 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:53 compute-0 nova_compute[117514]: 2025-10-08 19:12:53.943 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:53 compute-0 nova_compute[117514]: 2025-10-08 19:12:53.954 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 08 19:12:53 compute-0 nova_compute[117514]: 2025-10-08 19:12:53.954 2 INFO nova.compute.claims [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Claim successful on node compute-0.ctlplane.example.com
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.065 2 DEBUG nova.compute.provider_tree [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.079 2 DEBUG nova.scheduler.client.report [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.100 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.101 2 DEBUG nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.150 2 DEBUG nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.151 2 DEBUG nova.network.neutron [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.174 2 INFO nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.194 2 DEBUG nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.282 2 DEBUG nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.283 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.284 2 INFO nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Creating image(s)
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.284 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.285 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.285 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.296 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.388 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.390 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.391 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.413 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.492 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.493 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.524 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.526 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.527 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.568 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950759.5666447, 9064d639-63f3-422f-a67f-7a4dad8d2182 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.569 2 INFO nova.compute.manager [-] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] VM Stopped (Lifecycle Event)
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.576 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.576 2 DEBUG nova.virt.disk.api [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.577 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.595 2 DEBUG nova.compute.manager [None req-487306c8-8730-4489-b337-0fdab4299ba4 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.626 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.628 2 DEBUG nova.virt.disk.api [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.628 2 DEBUG nova.objects.instance [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 093d721c-61cb-4fd3-b678-7465d8840cc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.656 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.657 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Ensure instance console log exists: /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.657 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.658 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.659 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:55 compute-0 nova_compute[117514]: 2025-10-08 19:12:55.340 2 DEBUG nova.policy [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 08 19:12:56 compute-0 nova_compute[117514]: 2025-10-08 19:12:56.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:56 compute-0 podman[149049]: 2025-10-08 19:12:56.652639524 +0000 UTC m=+0.072720309 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001)
Oct 08 19:12:57 compute-0 nova_compute[117514]: 2025-10-08 19:12:57.295 2 DEBUG nova.network.neutron [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Successfully updated port: 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 08 19:12:57 compute-0 nova_compute[117514]: 2025-10-08 19:12:57.308 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-093d721c-61cb-4fd3-b678-7465d8840cc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:12:57 compute-0 nova_compute[117514]: 2025-10-08 19:12:57.308 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-093d721c-61cb-4fd3-b678-7465d8840cc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:12:57 compute-0 nova_compute[117514]: 2025-10-08 19:12:57.308 2 DEBUG nova.network.neutron [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 08 19:12:57 compute-0 nova_compute[117514]: 2025-10-08 19:12:57.390 2 DEBUG nova.compute.manager [req-ffe03cf9-9031-4704-b71b-98e12400eee4 req-1ddb67d5-c9d4-477e-a26a-1917d51ec9af bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Received event network-changed-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:12:57 compute-0 nova_compute[117514]: 2025-10-08 19:12:57.390 2 DEBUG nova.compute.manager [req-ffe03cf9-9031-4704-b71b-98e12400eee4 req-1ddb67d5-c9d4-477e-a26a-1917d51ec9af bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Refreshing instance network info cache due to event network-changed-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:12:57 compute-0 nova_compute[117514]: 2025-10-08 19:12:57.390 2 DEBUG oslo_concurrency.lockutils [req-ffe03cf9-9031-4704-b71b-98e12400eee4 req-1ddb67d5-c9d4-477e-a26a-1917d51ec9af bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-093d721c-61cb-4fd3-b678-7465d8840cc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:12:57 compute-0 nova_compute[117514]: 2025-10-08 19:12:57.451 2 DEBUG nova.network.neutron [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.171 2 DEBUG nova.network.neutron [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Updating instance_info_cache with network_info: [{"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.193 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-093d721c-61cb-4fd3-b678-7465d8840cc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.194 2 DEBUG nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Instance network_info: |[{"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.195 2 DEBUG oslo_concurrency.lockutils [req-ffe03cf9-9031-4704-b71b-98e12400eee4 req-1ddb67d5-c9d4-477e-a26a-1917d51ec9af bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-093d721c-61cb-4fd3-b678-7465d8840cc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.196 2 DEBUG nova.network.neutron [req-ffe03cf9-9031-4704-b71b-98e12400eee4 req-1ddb67d5-c9d4-477e-a26a-1917d51ec9af bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Refreshing network info cache for port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.203 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Start _get_guest_xml network_info=[{"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.210 2 WARNING nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.219 2 DEBUG nova.virt.libvirt.host [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.220 2 DEBUG nova.virt.libvirt.host [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.226 2 DEBUG nova.virt.libvirt.host [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.227 2 DEBUG nova.virt.libvirt.host [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.227 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.228 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.228 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.229 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.229 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.229 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.229 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.230 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.230 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.230 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.231 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.231 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.235 2 DEBUG nova.virt.libvirt.vif [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:12:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1974827177',display_name='tempest-TestNetworkBasicOps-server-1974827177',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1974827177',id=9,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLyoze6rxyNRjrKnZ/n+vsTth9kzwYzz/7DU1WtoejT8IDCjJBZl23bG4N5vxWcqQprun8odMD7xEnPv//MudkIlq44roa1e3u7lgMT8KOfJfpcO6Gbpp6ERjS4fOIF90w==',key_name='tempest-TestNetworkBasicOps-106019254',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-kf002v69',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:12:54Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=093d721c-61cb-4fd3-b678-7465d8840cc6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.236 2 DEBUG nova.network.os_vif_util [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.236 2 DEBUG nova.network.os_vif_util [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.237 2 DEBUG nova.objects.instance [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 093d721c-61cb-4fd3-b678-7465d8840cc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.254 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] End _get_guest_xml xml=<domain type="kvm">
Oct 08 19:12:59 compute-0 nova_compute[117514]:   <uuid>093d721c-61cb-4fd3-b678-7465d8840cc6</uuid>
Oct 08 19:12:59 compute-0 nova_compute[117514]:   <name>instance-00000009</name>
Oct 08 19:12:59 compute-0 nova_compute[117514]:   <memory>131072</memory>
Oct 08 19:12:59 compute-0 nova_compute[117514]:   <vcpu>1</vcpu>
Oct 08 19:12:59 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <nova:name>tempest-TestNetworkBasicOps-server-1974827177</nova:name>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <nova:creationTime>2025-10-08 19:12:59</nova:creationTime>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <nova:flavor name="m1.nano">
Oct 08 19:12:59 compute-0 nova_compute[117514]:         <nova:memory>128</nova:memory>
Oct 08 19:12:59 compute-0 nova_compute[117514]:         <nova:disk>1</nova:disk>
Oct 08 19:12:59 compute-0 nova_compute[117514]:         <nova:swap>0</nova:swap>
Oct 08 19:12:59 compute-0 nova_compute[117514]:         <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:12:59 compute-0 nova_compute[117514]:         <nova:vcpus>1</nova:vcpus>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       </nova:flavor>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <nova:owner>
Oct 08 19:12:59 compute-0 nova_compute[117514]:         <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:12:59 compute-0 nova_compute[117514]:         <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       </nova:owner>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <nova:ports>
Oct 08 19:12:59 compute-0 nova_compute[117514]:         <nova:port uuid="8b1cf032-8f00-4a3b-a370-211b5b0ca4ce">
Oct 08 19:12:59 compute-0 nova_compute[117514]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:         </nova:port>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       </nova:ports>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     </nova:instance>
Oct 08 19:12:59 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:12:59 compute-0 nova_compute[117514]:   <sysinfo type="smbios">
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <system>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <entry name="manufacturer">RDO</entry>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <entry name="product">OpenStack Compute</entry>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <entry name="serial">093d721c-61cb-4fd3-b678-7465d8840cc6</entry>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <entry name="uuid">093d721c-61cb-4fd3-b678-7465d8840cc6</entry>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <entry name="family">Virtual Machine</entry>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     </system>
Oct 08 19:12:59 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:12:59 compute-0 nova_compute[117514]:   <os>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <boot dev="hd"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <smbios mode="sysinfo"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:   </os>
Oct 08 19:12:59 compute-0 nova_compute[117514]:   <features>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <vmcoreinfo/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:   </features>
Oct 08 19:12:59 compute-0 nova_compute[117514]:   <clock offset="utc">
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <timer name="pit" tickpolicy="delay"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <timer name="hpet" present="no"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:12:59 compute-0 nova_compute[117514]:   <cpu mode="host-model" match="exact">
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <topology sockets="1" cores="1" threads="1"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:12:59 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <disk type="file" device="disk">
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <target dev="vda" bus="virtio"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <disk type="file" device="cdrom">
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <driver name="qemu" type="raw" cache="none"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk.config"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <target dev="sda" bus="sata"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <interface type="ethernet">
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <mac address="fa:16:3e:78:c5:0e"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <mtu size="1442"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <target dev="tap8b1cf032-8f"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <serial type="pty">
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <log file="/var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/console.log" append="off"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <video>
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     </video>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <input type="tablet" bus="usb"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <rng model="virtio">
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <backend model="random">/dev/urandom</backend>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <controller type="usb" index="0"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     <memballoon model="virtio">
Oct 08 19:12:59 compute-0 nova_compute[117514]:       <stats period="10"/>
Oct 08 19:12:59 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:12:59 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:12:59 compute-0 nova_compute[117514]: </domain>
Oct 08 19:12:59 compute-0 nova_compute[117514]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.256 2 DEBUG nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Preparing to wait for external event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.256 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.256 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.256 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.257 2 DEBUG nova.virt.libvirt.vif [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:12:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1974827177',display_name='tempest-TestNetworkBasicOps-server-1974827177',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1974827177',id=9,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLyoze6rxyNRjrKnZ/n+vsTth9kzwYzz/7DU1WtoejT8IDCjJBZl23bG4N5vxWcqQprun8odMD7xEnPv//MudkIlq44roa1e3u7lgMT8KOfJfpcO6Gbpp6ERjS4fOIF90w==',key_name='tempest-TestNetworkBasicOps-106019254',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-kf002v69',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:12:54Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=093d721c-61cb-4fd3-b678-7465d8840cc6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.258 2 DEBUG nova.network.os_vif_util [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.258 2 DEBUG nova.network.os_vif_util [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.259 2 DEBUG os_vif [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.260 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.260 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.264 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8b1cf032-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.264 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8b1cf032-8f, col_values=(('external_ids', {'iface-id': '8b1cf032-8f00-4a3b-a370-211b5b0ca4ce', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:78:c5:0e', 'vm-uuid': '093d721c-61cb-4fd3-b678-7465d8840cc6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:59 compute-0 NetworkManager[1035]: <info>  [1759950779.2676] manager: (tap8b1cf032-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.276 2 INFO os_vif [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f')
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.457 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.458 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.458 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:78:c5:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 08 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.459 2 INFO nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Using config drive
Oct 08 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.221 2 INFO nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Creating config drive at /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk.config
Oct 08 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.227 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4ns4exzo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.349 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4ns4exzo" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:13:00 compute-0 kernel: tap8b1cf032-8f: entered promiscuous mode
Oct 08 19:13:00 compute-0 NetworkManager[1035]: <info>  [1759950780.4275] manager: (tap8b1cf032-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Oct 08 19:13:00 compute-0 ovn_controller[19759]: 2025-10-08T19:13:00Z|00119|binding|INFO|Claiming lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce for this chassis.
Oct 08 19:13:00 compute-0 ovn_controller[19759]: 2025-10-08T19:13:00Z|00120|binding|INFO|8b1cf032-8f00-4a3b-a370-211b5b0ca4ce: Claiming fa:16:3e:78:c5:0e 10.100.0.4
Oct 08 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:00 compute-0 systemd-udevd[149089]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.454 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:c5:0e 10.100.0.4'], port_security=['fa:16:3e:78:c5:0e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-94131643', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '093d721c-61cb-4fd3-b678-7465d8840cc6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-94131643', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'be57f10c-6afc-483d-a1fa-fab953b8fe3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d3717a1-7175-40f6-8720-19b5f1d50c4f, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:13:00 compute-0 ovn_controller[19759]: 2025-10-08T19:13:00Z|00121|binding|INFO|Setting lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce ovn-installed in OVS
Oct 08 19:13:00 compute-0 ovn_controller[19759]: 2025-10-08T19:13:00Z|00122|binding|INFO|Setting lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce up in Southbound
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.455 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce in datapath 9bec1de5-8be3-4df6-b90a-943d76fedc48 bound to our chassis
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.456 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9bec1de5-8be3-4df6-b90a-943d76fedc48
Oct 08 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:00 compute-0 NetworkManager[1035]: <info>  [1759950780.4720] device (tap8b1cf032-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 08 19:13:00 compute-0 NetworkManager[1035]: <info>  [1759950780.4743] device (tap8b1cf032-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.470 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[51aade05-2ce0-457b-880d-b2cbb8b85f0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.472 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9bec1de5-81 in ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.474 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9bec1de5-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.474 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e3616fae-ec15-4429-b09f-cfd3aba805c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.476 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[4672f836-921b-4dea-805c-b7766a46359c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:00 compute-0 systemd-machined[77568]: New machine qemu-9-instance-00000009.
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.492 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[38af38ec-9da5-4fe7-a462-ce506eef82f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.507 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[0c2c9c94-3ef8-4fe3-a333-38edca5f006a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:00 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000009.
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.536 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[dbc2fb2a-9365-4a02-bc77-78a5bd529c66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.542 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ad658083-ae39-4345-bcf5-46387fe93d82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:00 compute-0 NetworkManager[1035]: <info>  [1759950780.5445] manager: (tap9bec1de5-80): new Veth device (/org/freedesktop/NetworkManager/Devices/72)
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.582 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[f6ba6c15-9470-47d5-881a-fe5f010c9b50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.586 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[b1560fca-b16f-44dc-b2e4-0d34bff21b12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:00 compute-0 NetworkManager[1035]: <info>  [1759950780.6090] device (tap9bec1de5-80): carrier: link connected
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.615 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[c5478c87-3d66-468c-ae6e-b03955a990ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.633 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[713db871-2f5c-4d9c-93fe-344bc10cacbb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9bec1de5-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e5:ad:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 145118, 'reachable_time': 22925, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 149125, 'error': None, 'target': 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.641 2 DEBUG nova.compute.manager [req-493149bb-30ab-4d0c-871c-99c6e6a26035 req-2f1d66b2-4fbb-4b8a-b8ff-cb4075d0e9d4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Received event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.642 2 DEBUG oslo_concurrency.lockutils [req-493149bb-30ab-4d0c-871c-99c6e6a26035 req-2f1d66b2-4fbb-4b8a-b8ff-cb4075d0e9d4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.643 2 DEBUG oslo_concurrency.lockutils [req-493149bb-30ab-4d0c-871c-99c6e6a26035 req-2f1d66b2-4fbb-4b8a-b8ff-cb4075d0e9d4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.643 2 DEBUG oslo_concurrency.lockutils [req-493149bb-30ab-4d0c-871c-99c6e6a26035 req-2f1d66b2-4fbb-4b8a-b8ff-cb4075d0e9d4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.644 2 DEBUG nova.compute.manager [req-493149bb-30ab-4d0c-871c-99c6e6a26035 req-2f1d66b2-4fbb-4b8a-b8ff-cb4075d0e9d4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Processing event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.656 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[08ef3c1e-b7f8-451e-8d5e-2701a30bb5ab]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee5:ad60'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 145118, 'tstamp': 145118}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 149126, 'error': None, 'target': 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.677 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f86da07c-6473-4bb5-b758-73915834b1bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9bec1de5-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e5:ad:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 145118, 'reachable_time': 22925, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 149127, 'error': None, 'target': 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.723 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[cf575ec2-4ecd-42b4-9f36-b68e5b7fa8a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.813 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[fceb1425-7f37-464b-bab4-a19c3290eca1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.816 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bec1de5-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.816 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.817 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9bec1de5-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:00 compute-0 NetworkManager[1035]: <info>  [1759950780.8207] manager: (tap9bec1de5-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Oct 08 19:13:00 compute-0 kernel: tap9bec1de5-80: entered promiscuous mode
Oct 08 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.825 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9bec1de5-80, col_values=(('external_ids', {'iface-id': 'b57e5c57-68fb-43de-8f87-98a853dc8be7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:00 compute-0 ovn_controller[19759]: 2025-10-08T19:13:00Z|00123|binding|INFO|Releasing lport b57e5c57-68fb-43de-8f87-98a853dc8be7 from this chassis (sb_readonly=0)
Oct 08 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.851 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9bec1de5-8be3-4df6-b90a-943d76fedc48.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9bec1de5-8be3-4df6-b90a-943d76fedc48.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.852 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[910c6230-b3ce-4481-ac85-58b9189dc924]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.853 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: global
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     log         /dev/log local0 debug
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     log-tag     haproxy-metadata-proxy-9bec1de5-8be3-4df6-b90a-943d76fedc48
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     user        root
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     group       root
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     maxconn     1024
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     pidfile     /var/lib/neutron/external/pids/9bec1de5-8be3-4df6-b90a-943d76fedc48.pid.haproxy
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     daemon
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: defaults
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     log global
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     mode http
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     option httplog
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     option dontlognull
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     option http-server-close
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     option forwardfor
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     retries                 3
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     timeout http-request    30s
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     timeout connect         30s
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     timeout client          32s
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     timeout server          32s
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     timeout http-keep-alive 30s
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: listen listener
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     bind 169.254.169.254:80
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     server metadata /var/lib/neutron/metadata_proxy
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:     http-request add-header X-OVN-Network-ID 9bec1de5-8be3-4df6-b90a-943d76fedc48
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 08 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.854 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'env', 'PROCESS_TAG=haproxy-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9bec1de5-8be3-4df6-b90a-943d76fedc48.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:01.131 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.161 2 DEBUG nova.network.neutron [req-ffe03cf9-9031-4704-b71b-98e12400eee4 req-1ddb67d5-c9d4-477e-a26a-1917d51ec9af bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Updated VIF entry in instance network info cache for port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.161 2 DEBUG nova.network.neutron [req-ffe03cf9-9031-4704-b71b-98e12400eee4 req-1ddb67d5-c9d4-477e-a26a-1917d51ec9af bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Updating instance_info_cache with network_info: [{"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.176 2 DEBUG oslo_concurrency.lockutils [req-ffe03cf9-9031-4704-b71b-98e12400eee4 req-1ddb67d5-c9d4-477e-a26a-1917d51ec9af bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-093d721c-61cb-4fd3-b678-7465d8840cc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:01 compute-0 podman[149164]: 2025-10-08 19:13:01.249115905 +0000 UTC m=+0.057414577 container create bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.259 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950781.2589304, 093d721c-61cb-4fd3-b678-7465d8840cc6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.260 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] VM Started (Lifecycle Event)
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.262 2 DEBUG nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.266 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.268 2 INFO nova.virt.libvirt.driver [-] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Instance spawned successfully.
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.269 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.279 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.281 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.289 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.289 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.290 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.290 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.290 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.290 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:13:01 compute-0 systemd[1]: Started libpod-conmon-bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177.scope.
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.297 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.297 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950781.261934, 093d721c-61cb-4fd3-b678-7465d8840cc6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.297 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] VM Paused (Lifecycle Event)
Oct 08 19:13:01 compute-0 podman[149164]: 2025-10-08 19:13:01.219354397 +0000 UTC m=+0.027653079 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 08 19:13:01 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.320 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:13:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd409b632a0b96c86655809d2dc0d8db1d12c174e3e7557cccb2a1f4c341ed48/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.324 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950781.2645817, 093d721c-61cb-4fd3-b678-7465d8840cc6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.324 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] VM Resumed (Lifecycle Event)
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.339 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:13:01 compute-0 podman[149164]: 2025-10-08 19:13:01.341700237 +0000 UTC m=+0.149998979 container init bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.341 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.346 2 INFO nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Took 7.06 seconds to spawn the instance on the hypervisor.
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.347 2 DEBUG nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:13:01 compute-0 podman[149164]: 2025-10-08 19:13:01.352259051 +0000 UTC m=+0.160557753 container start bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 08 19:13:01 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[149180]: [NOTICE]   (149184) : New worker (149186) forked
Oct 08 19:13:01 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[149180]: [NOTICE]   (149184) : Loading success.
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.373 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.411 2 INFO nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Took 7.50 seconds to build instance.
Oct 08 19:13:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:01.428 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 08 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.430 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:02 compute-0 podman[149195]: 2025-10-08 19:13:02.64336497 +0000 UTC m=+0.060889628 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Oct 08 19:13:02 compute-0 podman[149196]: 2025-10-08 19:13:02.678609236 +0000 UTC m=+0.095322411 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 08 19:13:02 compute-0 nova_compute[117514]: 2025-10-08 19:13:02.709 2 DEBUG nova.compute.manager [req-16252b51-7bcc-4f60-a65a-662c7cf1c589 req-b27febea-f744-41ef-bbd5-a9a022440b89 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Received event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:13:02 compute-0 nova_compute[117514]: 2025-10-08 19:13:02.710 2 DEBUG oslo_concurrency.lockutils [req-16252b51-7bcc-4f60-a65a-662c7cf1c589 req-b27febea-f744-41ef-bbd5-a9a022440b89 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:13:02 compute-0 nova_compute[117514]: 2025-10-08 19:13:02.710 2 DEBUG oslo_concurrency.lockutils [req-16252b51-7bcc-4f60-a65a-662c7cf1c589 req-b27febea-f744-41ef-bbd5-a9a022440b89 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:13:02 compute-0 nova_compute[117514]: 2025-10-08 19:13:02.710 2 DEBUG oslo_concurrency.lockutils [req-16252b51-7bcc-4f60-a65a-662c7cf1c589 req-b27febea-f744-41ef-bbd5-a9a022440b89 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:02 compute-0 nova_compute[117514]: 2025-10-08 19:13:02.711 2 DEBUG nova.compute.manager [req-16252b51-7bcc-4f60-a65a-662c7cf1c589 req-b27febea-f744-41ef-bbd5-a9a022440b89 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] No waiting events found dispatching network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:13:02 compute-0 nova_compute[117514]: 2025-10-08 19:13:02.711 2 WARNING nova.compute.manager [req-16252b51-7bcc-4f60-a65a-662c7cf1c589 req-b27febea-f744-41ef-bbd5-a9a022440b89 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Received unexpected event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce for instance with vm_state active and task_state None.
Oct 08 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.888 2 DEBUG oslo_concurrency.lockutils [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "093d721c-61cb-4fd3-b678-7465d8840cc6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.889 2 DEBUG oslo_concurrency.lockutils [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.889 2 DEBUG oslo_concurrency.lockutils [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.889 2 DEBUG oslo_concurrency.lockutils [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.889 2 DEBUG oslo_concurrency.lockutils [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.891 2 INFO nova.compute.manager [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Terminating instance
Oct 08 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.892 2 DEBUG nova.compute.manager [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 08 19:13:03 compute-0 kernel: tap8b1cf032-8f (unregistering): left promiscuous mode
Oct 08 19:13:03 compute-0 NetworkManager[1035]: <info>  [1759950783.9219] device (tap8b1cf032-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:03 compute-0 ovn_controller[19759]: 2025-10-08T19:13:03Z|00124|binding|INFO|Releasing lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce from this chassis (sb_readonly=0)
Oct 08 19:13:03 compute-0 ovn_controller[19759]: 2025-10-08T19:13:03Z|00125|binding|INFO|Setting lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce down in Southbound
Oct 08 19:13:03 compute-0 ovn_controller[19759]: 2025-10-08T19:13:03Z|00126|binding|INFO|Removing iface tap8b1cf032-8f ovn-installed in OVS
Oct 08 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:03.943 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:c5:0e 10.100.0.4'], port_security=['fa:16:3e:78:c5:0e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-94131643', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '093d721c-61cb-4fd3-b678-7465d8840cc6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-94131643', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'be57f10c-6afc-483d-a1fa-fab953b8fe3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d3717a1-7175-40f6-8720-19b5f1d50c4f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:13:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:03.945 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce in datapath 9bec1de5-8be3-4df6-b90a-943d76fedc48 unbound from our chassis
Oct 08 19:13:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:03.947 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9bec1de5-8be3-4df6-b90a-943d76fedc48, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 08 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:03.949 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[397e1090-0e30-4af6-916d-6a422311cc3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:03.950 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 namespace which is not needed anymore
Oct 08 19:13:03 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Deactivated successfully.
Oct 08 19:13:03 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Consumed 3.382s CPU time.
Oct 08 19:13:03 compute-0 systemd-machined[77568]: Machine qemu-9-instance-00000009 terminated.
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.168 2 INFO nova.virt.libvirt.driver [-] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Instance destroyed successfully.
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.169 2 DEBUG nova.objects.instance [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 093d721c-61cb-4fd3-b678-7465d8840cc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.184 2 DEBUG nova.virt.libvirt.vif [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:12:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1974827177',display_name='tempest-TestNetworkBasicOps-server-1974827177',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1974827177',id=9,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLyoze6rxyNRjrKnZ/n+vsTth9kzwYzz/7DU1WtoejT8IDCjJBZl23bG4N5vxWcqQprun8odMD7xEnPv//MudkIlq44roa1e3u7lgMT8KOfJfpcO6Gbpp6ERjS4fOIF90w==',key_name='tempest-TestNetworkBasicOps-106019254',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:13:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-kf002v69',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:13:01Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=093d721c-61cb-4fd3-b678-7465d8840cc6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.184 2 DEBUG nova.network.os_vif_util [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.185 2 DEBUG nova.network.os_vif_util [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.186 2 DEBUG os_vif [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.189 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b1cf032-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.217 2 INFO os_vif [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f')
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.218 2 INFO nova.virt.libvirt.driver [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Deleting instance files /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6_del
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.219 2 INFO nova.virt.libvirt.driver [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Deletion of /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6_del complete
Oct 08 19:13:04 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[149180]: [NOTICE]   (149184) : haproxy version is 2.8.14-c23fe91
Oct 08 19:13:04 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[149180]: [NOTICE]   (149184) : path to executable is /usr/sbin/haproxy
Oct 08 19:13:04 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[149180]: [WARNING]  (149184) : Exiting Master process...
Oct 08 19:13:04 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[149180]: [WARNING]  (149184) : Exiting Master process...
Oct 08 19:13:04 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[149180]: [ALERT]    (149184) : Current worker (149186) exited with code 143 (Terminated)
Oct 08 19:13:04 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[149180]: [WARNING]  (149184) : All workers exited. Exiting... (0)
Oct 08 19:13:04 compute-0 systemd[1]: libpod-bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177.scope: Deactivated successfully.
Oct 08 19:13:04 compute-0 podman[149254]: 2025-10-08 19:13:04.245562545 +0000 UTC m=+0.161859781 container died bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.274 2 INFO nova.compute.manager [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Took 0.38 seconds to destroy the instance on the hypervisor.
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.275 2 DEBUG oslo.service.loopingcall [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.275 2 DEBUG nova.compute.manager [-] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.275 2 DEBUG nova.network.neutron [-] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 08 19:13:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177-userdata-shm.mount: Deactivated successfully.
Oct 08 19:13:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-dd409b632a0b96c86655809d2dc0d8db1d12c174e3e7557cccb2a1f4c341ed48-merged.mount: Deactivated successfully.
Oct 08 19:13:04 compute-0 podman[149300]: 2025-10-08 19:13:04.528039344 +0000 UTC m=+0.074784578 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 08 19:13:04 compute-0 podman[149254]: 2025-10-08 19:13:04.559529173 +0000 UTC m=+0.475826409 container cleanup bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:13:04 compute-0 systemd[1]: libpod-conmon-bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177.scope: Deactivated successfully.
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.952 2 DEBUG nova.compute.manager [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Received event network-vif-unplugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.953 2 DEBUG oslo_concurrency.lockutils [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.954 2 DEBUG oslo_concurrency.lockutils [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.954 2 DEBUG oslo_concurrency.lockutils [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.955 2 DEBUG nova.compute.manager [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] No waiting events found dispatching network-vif-unplugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.955 2 DEBUG nova.compute.manager [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Received event network-vif-unplugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.956 2 DEBUG nova.compute.manager [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Received event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.956 2 DEBUG oslo_concurrency.lockutils [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.956 2 DEBUG oslo_concurrency.lockutils [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.957 2 DEBUG oslo_concurrency.lockutils [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.957 2 DEBUG nova.compute.manager [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] No waiting events found dispatching network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.958 2 WARNING nova.compute.manager [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Received unexpected event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce for instance with vm_state active and task_state deleting.
Oct 08 19:13:05 compute-0 podman[149333]: 2025-10-08 19:13:05.056258714 +0000 UTC m=+0.467065137 container remove bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 08 19:13:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:05.065 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[c7716e01-bc45-424d-9f28-89070d555139]: (4, ('Wed Oct  8 07:13:04 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 (bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177)\nbb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177\nWed Oct  8 07:13:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 (bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177)\nbb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:05.068 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[9f0ae40e-c206-49de-b27e-f4dacc222e56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:05.070 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bec1de5-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:05 compute-0 kernel: tap9bec1de5-80: left promiscuous mode
Oct 08 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:05.094 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[27da0720-7416-4fb1-9008-346752c560d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:05.129 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[7b5176bf-26d5-4145-ad21-bf7ef340f614]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:05.130 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[081d857e-60c5-4f5e-9d33-deefbc807eb4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:05.156 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[39aec415-f71d-4efd-9d86-7d5a8f1bcb37]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 145111, 'reachable_time': 28752, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 149349, 'error': None, 'target': 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:05.160 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 08 19:13:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:05.160 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[45ccd5f6-7262-495c-b2db-6770589a065a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:05 compute-0 systemd[1]: run-netns-ovnmeta\x2d9bec1de5\x2d8be3\x2d4df6\x2db90a\x2d943d76fedc48.mount: Deactivated successfully.
Oct 08 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.474 2 DEBUG nova.network.neutron [-] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.492 2 INFO nova.compute.manager [-] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Took 1.22 seconds to deallocate network for instance.
Oct 08 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.534 2 DEBUG oslo_concurrency.lockutils [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.535 2 DEBUG oslo_concurrency.lockutils [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.601 2 DEBUG nova.compute.provider_tree [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.622 2 DEBUG nova.scheduler.client.report [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.646 2 DEBUG oslo_concurrency.lockutils [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.677 2 INFO nova.scheduler.client.report [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 093d721c-61cb-4fd3-b678-7465d8840cc6
Oct 08 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.748 2 DEBUG oslo_concurrency.lockutils [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.860s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:06 compute-0 nova_compute[117514]: 2025-10-08 19:13:06.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:07 compute-0 nova_compute[117514]: 2025-10-08 19:13:07.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:13:07 compute-0 nova_compute[117514]: 2025-10-08 19:13:07.739 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:13:07 compute-0 nova_compute[117514]: 2025-10-08 19:13:07.739 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:13:07 compute-0 nova_compute[117514]: 2025-10-08 19:13:07.740 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:07 compute-0 nova_compute[117514]: 2025-10-08 19:13:07.740 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:13:07 compute-0 podman[149351]: 2025-10-08 19:13:07.877843118 +0000 UTC m=+0.085742134 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:13:07 compute-0 podman[149353]: 2025-10-08 19:13:07.898410582 +0000 UTC m=+0.087305850 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:13:07 compute-0 podman[149352]: 2025-10-08 19:13:07.932828065 +0000 UTC m=+0.126308995 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:13:07 compute-0 nova_compute[117514]: 2025-10-08 19:13:07.993 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:13:07 compute-0 nova_compute[117514]: 2025-10-08 19:13:07.996 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6069MB free_disk=73.41393280029297GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:13:07 compute-0 nova_compute[117514]: 2025-10-08 19:13:07.996 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:13:07 compute-0 nova_compute[117514]: 2025-10-08 19:13:07.997 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:13:08 compute-0 nova_compute[117514]: 2025-10-08 19:13:08.061 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:13:08 compute-0 nova_compute[117514]: 2025-10-08 19:13:08.061 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:13:08 compute-0 nova_compute[117514]: 2025-10-08 19:13:08.084 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:13:08 compute-0 nova_compute[117514]: 2025-10-08 19:13:08.098 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:13:08 compute-0 nova_compute[117514]: 2025-10-08 19:13:08.120 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:13:08 compute-0 nova_compute[117514]: 2025-10-08 19:13:08.121 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:09 compute-0 nova_compute[117514]: 2025-10-08 19:13:09.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:10 compute-0 nova_compute[117514]: 2025-10-08 19:13:10.116 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:13:10 compute-0 nova_compute[117514]: 2025-10-08 19:13:10.118 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:13:10 compute-0 nova_compute[117514]: 2025-10-08 19:13:10.118 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:13:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:10.431 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:13:10 compute-0 nova_compute[117514]: 2025-10-08 19:13:10.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:13:10 compute-0 nova_compute[117514]: 2025-10-08 19:13:10.719 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:13:10 compute-0 nova_compute[117514]: 2025-10-08 19:13:10.719 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:13:10 compute-0 nova_compute[117514]: 2025-10-08 19:13:10.745 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 08 19:13:11 compute-0 nova_compute[117514]: 2025-10-08 19:13:11.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:11 compute-0 nova_compute[117514]: 2025-10-08 19:13:11.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:11 compute-0 nova_compute[117514]: 2025-10-08 19:13:11.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:11 compute-0 nova_compute[117514]: 2025-10-08 19:13:11.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:13:11 compute-0 nova_compute[117514]: 2025-10-08 19:13:11.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:13:12 compute-0 nova_compute[117514]: 2025-10-08 19:13:12.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:13:13 compute-0 nova_compute[117514]: 2025-10-08 19:13:13.714 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:13:13 compute-0 nova_compute[117514]: 2025-10-08 19:13:13.735 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:13:14 compute-0 nova_compute[117514]: 2025-10-08 19:13:14.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:16 compute-0 nova_compute[117514]: 2025-10-08 19:13:16.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:17 compute-0 podman[149414]: 2025-10-08 19:13:17.626665618 +0000 UTC m=+0.050586070 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 08 19:13:19 compute-0 nova_compute[117514]: 2025-10-08 19:13:19.167 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950784.165181, 093d721c-61cb-4fd3-b678-7465d8840cc6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:13:19 compute-0 nova_compute[117514]: 2025-10-08 19:13:19.167 2 INFO nova.compute.manager [-] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] VM Stopped (Lifecycle Event)
Oct 08 19:13:19 compute-0 nova_compute[117514]: 2025-10-08 19:13:19.185 2 DEBUG nova.compute.manager [None req-d26191a6-8b0b-4c65-8c32-a1f4b8dfc372 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:13:19 compute-0 nova_compute[117514]: 2025-10-08 19:13:19.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:21 compute-0 nova_compute[117514]: 2025-10-08 19:13:21.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:24 compute-0 nova_compute[117514]: 2025-10-08 19:13:24.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.501 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "f6235bae-08b8-41c2-a187-92e12703dc49" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.501 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.521 2 DEBUG nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 08 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.735 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.736 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.744 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 08 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.744 2 INFO nova.compute.claims [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Claim successful on node compute-0.ctlplane.example.com
Oct 08 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.848 2 DEBUG nova.compute.provider_tree [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.863 2 DEBUG nova.scheduler.client.report [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.891 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.892 2 DEBUG nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 08 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.946 2 DEBUG nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 08 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.947 2 DEBUG nova.network.neutron [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 08 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.968 2 INFO nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 08 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.994 2 DEBUG nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.096 2 DEBUG nova.policy [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.110 2 DEBUG nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.112 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.113 2 INFO nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Creating image(s)
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.114 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.114 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.116 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.143 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.229 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.231 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.232 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.256 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.348 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.350 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:13:27 compute-0 podman[149449]: 2025-10-08 19:13:27.674553905 +0000 UTC m=+0.086185517 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible)
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.685 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk 1073741824" returned: 0 in 0.335s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.687 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.455s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.688 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.774 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.775 2 DEBUG nova.virt.disk.api [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.776 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.841 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.843 2 DEBUG nova.virt.disk.api [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.844 2 DEBUG nova.objects.instance [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid f6235bae-08b8-41c2-a187-92e12703dc49 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.863 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.864 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Ensure instance console log exists: /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.864 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.865 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.865 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.893 2 DEBUG nova.network.neutron [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Successfully created port: b5212a27-711c-427f-af17-227f961acc42 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 08 19:13:29 compute-0 nova_compute[117514]: 2025-10-08 19:13:29.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:29 compute-0 nova_compute[117514]: 2025-10-08 19:13:29.478 2 DEBUG nova.network.neutron [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Successfully updated port: b5212a27-711c-427f-af17-227f961acc42 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 08 19:13:29 compute-0 nova_compute[117514]: 2025-10-08 19:13:29.490 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:13:29 compute-0 nova_compute[117514]: 2025-10-08 19:13:29.491 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:13:29 compute-0 nova_compute[117514]: 2025-10-08 19:13:29.491 2 DEBUG nova.network.neutron [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 08 19:13:29 compute-0 nova_compute[117514]: 2025-10-08 19:13:29.574 2 DEBUG nova.compute.manager [req-4a0d852f-e4d1-4835-ad03-70c9bbec9e3c req-7f27874f-df74-4e3a-b56f-907055577372 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received event network-changed-b5212a27-711c-427f-af17-227f961acc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:13:29 compute-0 nova_compute[117514]: 2025-10-08 19:13:29.574 2 DEBUG nova.compute.manager [req-4a0d852f-e4d1-4835-ad03-70c9bbec9e3c req-7f27874f-df74-4e3a-b56f-907055577372 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Refreshing instance network info cache due to event network-changed-b5212a27-711c-427f-af17-227f961acc42. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:13:29 compute-0 nova_compute[117514]: 2025-10-08 19:13:29.575 2 DEBUG oslo_concurrency.lockutils [req-4a0d852f-e4d1-4835-ad03-70c9bbec9e3c req-7f27874f-df74-4e3a-b56f-907055577372 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:13:30 compute-0 nova_compute[117514]: 2025-10-08 19:13:30.158 2 DEBUG nova.network.neutron [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.230 2 DEBUG nova.network.neutron [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Updating instance_info_cache with network_info: [{"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.293 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.293 2 DEBUG nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Instance network_info: |[{"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.294 2 DEBUG oslo_concurrency.lockutils [req-4a0d852f-e4d1-4835-ad03-70c9bbec9e3c req-7f27874f-df74-4e3a-b56f-907055577372 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.294 2 DEBUG nova.network.neutron [req-4a0d852f-e4d1-4835-ad03-70c9bbec9e3c req-7f27874f-df74-4e3a-b56f-907055577372 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Refreshing network info cache for port b5212a27-711c-427f-af17-227f961acc42 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.298 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Start _get_guest_xml network_info=[{"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.305 2 WARNING nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.319 2 DEBUG nova.virt.libvirt.host [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.320 2 DEBUG nova.virt.libvirt.host [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.324 2 DEBUG nova.virt.libvirt.host [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.325 2 DEBUG nova.virt.libvirt.host [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.326 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.326 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.327 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.327 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.327 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.327 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.328 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.328 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.328 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.328 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.328 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.329 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.333 2 DEBUG nova.virt.libvirt.vif [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:13:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-240293585',display_name='tempest-TestNetworkBasicOps-server-240293585',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-240293585',id=10,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB/EJhQ2cVwT1bBhqwqz8VJCILEiuVe01OpwaJWr7LJzSA4TSCURQ/KKnNYCEn/1h4DXNQh6VFPnJP6UNtvndekIhyamyZMFdOa7ELSKKJb75n9Ge1ikETCgbfRbvFVTqw==',key_name='tempest-TestNetworkBasicOps-1252364033',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-3qhae7ry',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:13:27Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=f6235bae-08b8-41c2-a187-92e12703dc49,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.334 2 DEBUG nova.network.os_vif_util [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.335 2 DEBUG nova.network.os_vif_util [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:d1:87,bridge_name='br-int',has_traffic_filtering=True,id=b5212a27-711c-427f-af17-227f961acc42,network=Network(316ecc22-916e-4a30-bb08-c6bd94993bb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5212a27-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.336 2 DEBUG nova.objects.instance [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid f6235bae-08b8-41c2-a187-92e12703dc49 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.352 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] End _get_guest_xml xml=<domain type="kvm">
Oct 08 19:13:31 compute-0 nova_compute[117514]:   <uuid>f6235bae-08b8-41c2-a187-92e12703dc49</uuid>
Oct 08 19:13:31 compute-0 nova_compute[117514]:   <name>instance-0000000a</name>
Oct 08 19:13:31 compute-0 nova_compute[117514]:   <memory>131072</memory>
Oct 08 19:13:31 compute-0 nova_compute[117514]:   <vcpu>1</vcpu>
Oct 08 19:13:31 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <nova:name>tempest-TestNetworkBasicOps-server-240293585</nova:name>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <nova:creationTime>2025-10-08 19:13:31</nova:creationTime>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <nova:flavor name="m1.nano">
Oct 08 19:13:31 compute-0 nova_compute[117514]:         <nova:memory>128</nova:memory>
Oct 08 19:13:31 compute-0 nova_compute[117514]:         <nova:disk>1</nova:disk>
Oct 08 19:13:31 compute-0 nova_compute[117514]:         <nova:swap>0</nova:swap>
Oct 08 19:13:31 compute-0 nova_compute[117514]:         <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:13:31 compute-0 nova_compute[117514]:         <nova:vcpus>1</nova:vcpus>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       </nova:flavor>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <nova:owner>
Oct 08 19:13:31 compute-0 nova_compute[117514]:         <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:13:31 compute-0 nova_compute[117514]:         <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       </nova:owner>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <nova:ports>
Oct 08 19:13:31 compute-0 nova_compute[117514]:         <nova:port uuid="b5212a27-711c-427f-af17-227f961acc42">
Oct 08 19:13:31 compute-0 nova_compute[117514]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:         </nova:port>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       </nova:ports>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     </nova:instance>
Oct 08 19:13:31 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:13:31 compute-0 nova_compute[117514]:   <sysinfo type="smbios">
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <system>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <entry name="manufacturer">RDO</entry>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <entry name="product">OpenStack Compute</entry>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <entry name="serial">f6235bae-08b8-41c2-a187-92e12703dc49</entry>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <entry name="uuid">f6235bae-08b8-41c2-a187-92e12703dc49</entry>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <entry name="family">Virtual Machine</entry>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     </system>
Oct 08 19:13:31 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:13:31 compute-0 nova_compute[117514]:   <os>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <boot dev="hd"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <smbios mode="sysinfo"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:   </os>
Oct 08 19:13:31 compute-0 nova_compute[117514]:   <features>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <vmcoreinfo/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:   </features>
Oct 08 19:13:31 compute-0 nova_compute[117514]:   <clock offset="utc">
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <timer name="pit" tickpolicy="delay"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <timer name="hpet" present="no"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:13:31 compute-0 nova_compute[117514]:   <cpu mode="host-model" match="exact">
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <topology sockets="1" cores="1" threads="1"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:13:31 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <disk type="file" device="disk">
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <target dev="vda" bus="virtio"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <disk type="file" device="cdrom">
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <driver name="qemu" type="raw" cache="none"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk.config"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <target dev="sda" bus="sata"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <interface type="ethernet">
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <mac address="fa:16:3e:82:d1:87"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <mtu size="1442"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <target dev="tapb5212a27-71"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <serial type="pty">
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <log file="/var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/console.log" append="off"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <video>
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     </video>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <input type="tablet" bus="usb"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <rng model="virtio">
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <backend model="random">/dev/urandom</backend>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <controller type="usb" index="0"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     <memballoon model="virtio">
Oct 08 19:13:31 compute-0 nova_compute[117514]:       <stats period="10"/>
Oct 08 19:13:31 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:13:31 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:13:31 compute-0 nova_compute[117514]: </domain>
Oct 08 19:13:31 compute-0 nova_compute[117514]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.353 2 DEBUG nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Preparing to wait for external event network-vif-plugged-b5212a27-711c-427f-af17-227f961acc42 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.354 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.354 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.354 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.355 2 DEBUG nova.virt.libvirt.vif [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:13:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-240293585',display_name='tempest-TestNetworkBasicOps-server-240293585',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-240293585',id=10,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB/EJhQ2cVwT1bBhqwqz8VJCILEiuVe01OpwaJWr7LJzSA4TSCURQ/KKnNYCEn/1h4DXNQh6VFPnJP6UNtvndekIhyamyZMFdOa7ELSKKJb75n9Ge1ikETCgbfRbvFVTqw==',key_name='tempest-TestNetworkBasicOps-1252364033',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-3qhae7ry',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:13:27Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=f6235bae-08b8-41c2-a187-92e12703dc49,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.355 2 DEBUG nova.network.os_vif_util [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.356 2 DEBUG nova.network.os_vif_util [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:d1:87,bridge_name='br-int',has_traffic_filtering=True,id=b5212a27-711c-427f-af17-227f961acc42,network=Network(316ecc22-916e-4a30-bb08-c6bd94993bb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5212a27-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.356 2 DEBUG os_vif [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:d1:87,bridge_name='br-int',has_traffic_filtering=True,id=b5212a27-711c-427f-af17-227f961acc42,network=Network(316ecc22-916e-4a30-bb08-c6bd94993bb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5212a27-71') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.357 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.358 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.363 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5212a27-71, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.364 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb5212a27-71, col_values=(('external_ids', {'iface-id': 'b5212a27-711c-427f-af17-227f961acc42', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:d1:87', 'vm-uuid': 'f6235bae-08b8-41c2-a187-92e12703dc49'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:31 compute-0 NetworkManager[1035]: <info>  [1759950811.3669] manager: (tapb5212a27-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.382 2 INFO os_vif [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:d1:87,bridge_name='br-int',has_traffic_filtering=True,id=b5212a27-711c-427f-af17-227f961acc42,network=Network(316ecc22-916e-4a30-bb08-c6bd94993bb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5212a27-71')
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.445 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.445 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.445 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:82:d1:87, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.446 2 INFO nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Using config drive
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.698 2 INFO nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Creating config drive at /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk.config
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.707 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg4dzo948 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.847 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg4dzo948" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:13:31 compute-0 kernel: tapb5212a27-71: entered promiscuous mode
Oct 08 19:13:31 compute-0 NetworkManager[1035]: <info>  [1759950811.9362] manager: (tapb5212a27-71): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:31 compute-0 ovn_controller[19759]: 2025-10-08T19:13:31Z|00127|binding|INFO|Claiming lport b5212a27-711c-427f-af17-227f961acc42 for this chassis.
Oct 08 19:13:31 compute-0 ovn_controller[19759]: 2025-10-08T19:13:31Z|00128|binding|INFO|b5212a27-711c-427f-af17-227f961acc42: Claiming fa:16:3e:82:d1:87 10.100.0.7
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:31.960 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:d1:87 10.100.0.7'], port_security=['fa:16:3e:82:d1:87 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f6235bae-08b8-41c2-a187-92e12703dc49', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-316ecc22-916e-4a30-bb08-c6bd94993bb1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2739dafe-af3c-4b39-8e6a-f28bb373aed0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12b6fae3-2fc2-423c-bbb0-b3805950f5b3, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=b5212a27-711c-427f-af17-227f961acc42) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:13:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:31.963 28643 INFO neutron.agent.ovn.metadata.agent [-] Port b5212a27-711c-427f-af17-227f961acc42 in datapath 316ecc22-916e-4a30-bb08-c6bd94993bb1 bound to our chassis
Oct 08 19:13:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:31.964 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 316ecc22-916e-4a30-bb08-c6bd94993bb1
Oct 08 19:13:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:31.980 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b5168e12-f605-451b-8d3c-9f0604717dce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:31.981 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap316ecc22-91 in ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 08 19:13:31 compute-0 systemd-udevd[149495]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:13:31 compute-0 systemd-machined[77568]: New machine qemu-10-instance-0000000a.
Oct 08 19:13:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:31.984 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap316ecc22-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 08 19:13:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:31.984 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e7919a83-a6dc-4d6f-b9f9-daf6091dc6a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:31.986 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[6d549303-7563-417a-898c-aa9333be2c95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:31 compute-0 NetworkManager[1035]: <info>  [1759950811.9960] device (tapb5212a27-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 08 19:13:31 compute-0 NetworkManager[1035]: <info>  [1759950811.9969] device (tapb5212a27-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.006 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[28957b67-9abd-4753-806e-dc19d81204a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:32 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000a.
Oct 08 19:13:32 compute-0 nova_compute[117514]: 2025-10-08 19:13:32.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:32 compute-0 ovn_controller[19759]: 2025-10-08T19:13:32Z|00129|binding|INFO|Setting lport b5212a27-711c-427f-af17-227f961acc42 ovn-installed in OVS
Oct 08 19:13:32 compute-0 ovn_controller[19759]: 2025-10-08T19:13:32Z|00130|binding|INFO|Setting lport b5212a27-711c-427f-af17-227f961acc42 up in Southbound
Oct 08 19:13:32 compute-0 nova_compute[117514]: 2025-10-08 19:13:32.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.024 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[3b1f1e7f-8761-4d8e-82ca-4e2f64f2c72b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.054 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[2e860b5b-2f9b-4546-8934-7651b47012b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.061 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[46844e62-bbdd-4545-920e-bedbe76dfafe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:32 compute-0 systemd-udevd[149498]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:13:32 compute-0 NetworkManager[1035]: <info>  [1759950812.0631] manager: (tap316ecc22-90): new Veth device (/org/freedesktop/NetworkManager/Devices/76)
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.103 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[9723aa1a-c796-43b5-b258-a5e39211c8ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.107 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[73711c02-85fb-435d-95f1-391570010264]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:32 compute-0 NetworkManager[1035]: <info>  [1759950812.1307] device (tap316ecc22-90): carrier: link connected
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.137 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[afba7f7e-b1ad-477e-8a7f-da178b4512fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.158 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1754e981-c776-4939-9e67-8b82a84488d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap316ecc22-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:18:46'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 148271, 'reachable_time': 43666, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 149527, 'error': None, 'target': 'ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.176 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[a3f6c09f-a36a-4930-b697-c9ae342d719b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feef:1846'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 148271, 'tstamp': 148271}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 149528, 'error': None, 'target': 'ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.195 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[26de5419-c475-4051-a069-f624905fc9fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap316ecc22-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:18:46'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 148271, 'reachable_time': 43666, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 149529, 'error': None, 'target': 'ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.233 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[87ba2427-1fec-476f-9184-a1dbdb20687b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.311 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[9b236615-146f-4831-976a-7d081af421b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.313 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap316ecc22-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.314 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.314 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap316ecc22-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:13:32 compute-0 kernel: tap316ecc22-90: entered promiscuous mode
Oct 08 19:13:32 compute-0 nova_compute[117514]: 2025-10-08 19:13:32.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:32 compute-0 NetworkManager[1035]: <info>  [1759950812.3178] manager: (tap316ecc22-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.320 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap316ecc22-90, col_values=(('external_ids', {'iface-id': '59904509-05c2-48c7-bbf8-0fca2b0d7dd8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:13:32 compute-0 ovn_controller[19759]: 2025-10-08T19:13:32Z|00131|binding|INFO|Releasing lport 59904509-05c2-48c7-bbf8-0fca2b0d7dd8 from this chassis (sb_readonly=0)
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.347 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/316ecc22-916e-4a30-bb08-c6bd94993bb1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/316ecc22-916e-4a30-bb08-c6bd94993bb1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 08 19:13:32 compute-0 nova_compute[117514]: 2025-10-08 19:13:32.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.350 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[bb19937d-c438-47ab-914e-29a63cee0078]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.351 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: global
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     log         /dev/log local0 debug
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     log-tag     haproxy-metadata-proxy-316ecc22-916e-4a30-bb08-c6bd94993bb1
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     user        root
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     group       root
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     maxconn     1024
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     pidfile     /var/lib/neutron/external/pids/316ecc22-916e-4a30-bb08-c6bd94993bb1.pid.haproxy
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     daemon
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: defaults
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     log global
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     mode http
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     option httplog
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     option dontlognull
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     option http-server-close
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     option forwardfor
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     retries                 3
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     timeout http-request    30s
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     timeout connect         30s
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     timeout client          32s
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     timeout server          32s
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     timeout http-keep-alive 30s
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: listen listener
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     bind 169.254.169.254:80
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     server metadata /var/lib/neutron/metadata_proxy
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:     http-request add-header X-OVN-Network-ID 316ecc22-916e-4a30-bb08-c6bd94993bb1
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 08 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.351 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1', 'env', 'PROCESS_TAG=haproxy-316ecc22-916e-4a30-bb08-c6bd94993bb1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/316ecc22-916e-4a30-bb08-c6bd94993bb1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 08 19:13:32 compute-0 nova_compute[117514]: 2025-10-08 19:13:32.419 2 DEBUG nova.compute.manager [req-4921b7ed-44da-47d1-906d-185a8d2e332d req-efce0ac2-058c-4cfb-9f3a-2791e7881bc7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received event network-vif-plugged-b5212a27-711c-427f-af17-227f961acc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:13:32 compute-0 nova_compute[117514]: 2025-10-08 19:13:32.419 2 DEBUG oslo_concurrency.lockutils [req-4921b7ed-44da-47d1-906d-185a8d2e332d req-efce0ac2-058c-4cfb-9f3a-2791e7881bc7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:13:32 compute-0 nova_compute[117514]: 2025-10-08 19:13:32.419 2 DEBUG oslo_concurrency.lockutils [req-4921b7ed-44da-47d1-906d-185a8d2e332d req-efce0ac2-058c-4cfb-9f3a-2791e7881bc7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:13:32 compute-0 nova_compute[117514]: 2025-10-08 19:13:32.419 2 DEBUG oslo_concurrency.lockutils [req-4921b7ed-44da-47d1-906d-185a8d2e332d req-efce0ac2-058c-4cfb-9f3a-2791e7881bc7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:32 compute-0 nova_compute[117514]: 2025-10-08 19:13:32.420 2 DEBUG nova.compute.manager [req-4921b7ed-44da-47d1-906d-185a8d2e332d req-efce0ac2-058c-4cfb-9f3a-2791e7881bc7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Processing event network-vif-plugged-b5212a27-711c-427f-af17-227f961acc42 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 08 19:13:32 compute-0 podman[149568]: 2025-10-08 19:13:32.741325445 +0000 UTC m=+0.023324414 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.090 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950813.0901656, f6235bae-08b8-41c2-a187-92e12703dc49 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.091 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] VM Started (Lifecycle Event)
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.095 2 DEBUG nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.100 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.104 2 INFO nova.virt.libvirt.driver [-] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Instance spawned successfully.
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.104 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.124 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.127 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.141 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.142 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.142 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.143 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.144 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.144 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.155 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.155 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950813.0903497, f6235bae-08b8-41c2-a187-92e12703dc49 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.156 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] VM Paused (Lifecycle Event)
Oct 08 19:13:33 compute-0 podman[149568]: 2025-10-08 19:13:33.183319686 +0000 UTC m=+0.465318655 container create 194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.200 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.204 2 DEBUG nova.network.neutron [req-4a0d852f-e4d1-4835-ad03-70c9bbec9e3c req-7f27874f-df74-4e3a-b56f-907055577372 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Updated VIF entry in instance network info cache for port b5212a27-711c-427f-af17-227f961acc42. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.204 2 DEBUG nova.network.neutron [req-4a0d852f-e4d1-4835-ad03-70c9bbec9e3c req-7f27874f-df74-4e3a-b56f-907055577372 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Updating instance_info_cache with network_info: [{"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.208 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950813.0992303, f6235bae-08b8-41c2-a187-92e12703dc49 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.209 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] VM Resumed (Lifecycle Event)
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.217 2 INFO nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Took 6.11 seconds to spawn the instance on the hypervisor.
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.218 2 DEBUG nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.225 2 DEBUG oslo_concurrency.lockutils [req-4a0d852f-e4d1-4835-ad03-70c9bbec9e3c req-7f27874f-df74-4e3a-b56f-907055577372 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.234 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.245 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.272 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.301 2 INFO nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Took 6.72 seconds to build instance.
Oct 08 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.319 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.818s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:33 compute-0 systemd[1]: Started libpod-conmon-194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf.scope.
Oct 08 19:13:33 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:13:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f612388ad5483d09f475d7eb14eac62998c672b7cb73df4597040f3e56b0e25/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 08 19:13:33 compute-0 podman[149568]: 2025-10-08 19:13:33.67457525 +0000 UTC m=+0.956574259 container init 194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:13:33 compute-0 podman[149568]: 2025-10-08 19:13:33.685304049 +0000 UTC m=+0.967303018 container start 194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 08 19:13:33 compute-0 neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1[149610]: [NOTICE]   (149628) : New worker (149630) forked
Oct 08 19:13:33 compute-0 neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1[149610]: [NOTICE]   (149628) : Loading success.
Oct 08 19:13:33 compute-0 podman[149582]: 2025-10-08 19:13:33.96920367 +0000 UTC m=+0.725764050 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd)
Oct 08 19:13:33 compute-0 podman[149581]: 2025-10-08 19:13:33.987076816 +0000 UTC m=+0.754529460 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 08 19:13:34 compute-0 nova_compute[117514]: 2025-10-08 19:13:34.511 2 DEBUG nova.compute.manager [req-e925bde7-f0a1-4b83-9c32-f037bfe0304e req-db73e1d3-abc2-448d-b80b-e45936085f43 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received event network-vif-plugged-b5212a27-711c-427f-af17-227f961acc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:13:34 compute-0 nova_compute[117514]: 2025-10-08 19:13:34.512 2 DEBUG oslo_concurrency.lockutils [req-e925bde7-f0a1-4b83-9c32-f037bfe0304e req-db73e1d3-abc2-448d-b80b-e45936085f43 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:13:34 compute-0 nova_compute[117514]: 2025-10-08 19:13:34.512 2 DEBUG oslo_concurrency.lockutils [req-e925bde7-f0a1-4b83-9c32-f037bfe0304e req-db73e1d3-abc2-448d-b80b-e45936085f43 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:13:34 compute-0 nova_compute[117514]: 2025-10-08 19:13:34.512 2 DEBUG oslo_concurrency.lockutils [req-e925bde7-f0a1-4b83-9c32-f037bfe0304e req-db73e1d3-abc2-448d-b80b-e45936085f43 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:34 compute-0 nova_compute[117514]: 2025-10-08 19:13:34.513 2 DEBUG nova.compute.manager [req-e925bde7-f0a1-4b83-9c32-f037bfe0304e req-db73e1d3-abc2-448d-b80b-e45936085f43 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] No waiting events found dispatching network-vif-plugged-b5212a27-711c-427f-af17-227f961acc42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:13:34 compute-0 nova_compute[117514]: 2025-10-08 19:13:34.513 2 WARNING nova.compute.manager [req-e925bde7-f0a1-4b83-9c32-f037bfe0304e req-db73e1d3-abc2-448d-b80b-e45936085f43 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received unexpected event network-vif-plugged-b5212a27-711c-427f-af17-227f961acc42 for instance with vm_state active and task_state None.
Oct 08 19:13:34 compute-0 podman[149640]: 2025-10-08 19:13:34.652658007 +0000 UTC m=+0.070591307 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 08 19:13:35 compute-0 nova_compute[117514]: 2025-10-08 19:13:35.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:35 compute-0 NetworkManager[1035]: <info>  [1759950815.7720] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Oct 08 19:13:35 compute-0 NetworkManager[1035]: <info>  [1759950815.7739] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Oct 08 19:13:35 compute-0 ovn_controller[19759]: 2025-10-08T19:13:35Z|00132|binding|INFO|Releasing lport 59904509-05c2-48c7-bbf8-0fca2b0d7dd8 from this chassis (sb_readonly=0)
Oct 08 19:13:35 compute-0 ovn_controller[19759]: 2025-10-08T19:13:35Z|00133|binding|INFO|Releasing lport 59904509-05c2-48c7-bbf8-0fca2b0d7dd8 from this chassis (sb_readonly=0)
Oct 08 19:13:35 compute-0 nova_compute[117514]: 2025-10-08 19:13:35.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:35 compute-0 nova_compute[117514]: 2025-10-08 19:13:35.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:36 compute-0 nova_compute[117514]: 2025-10-08 19:13:36.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:36 compute-0 nova_compute[117514]: 2025-10-08 19:13:36.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:36 compute-0 nova_compute[117514]: 2025-10-08 19:13:36.396 2 DEBUG nova.compute.manager [req-f3f3cb46-4492-4ec8-a3e9-77e64dd3c90c req-ec017432-9bbd-471b-9cca-71dcd9e5f3a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received event network-changed-b5212a27-711c-427f-af17-227f961acc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:13:36 compute-0 nova_compute[117514]: 2025-10-08 19:13:36.397 2 DEBUG nova.compute.manager [req-f3f3cb46-4492-4ec8-a3e9-77e64dd3c90c req-ec017432-9bbd-471b-9cca-71dcd9e5f3a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Refreshing instance network info cache due to event network-changed-b5212a27-711c-427f-af17-227f961acc42. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:13:36 compute-0 nova_compute[117514]: 2025-10-08 19:13:36.397 2 DEBUG oslo_concurrency.lockutils [req-f3f3cb46-4492-4ec8-a3e9-77e64dd3c90c req-ec017432-9bbd-471b-9cca-71dcd9e5f3a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:13:36 compute-0 nova_compute[117514]: 2025-10-08 19:13:36.397 2 DEBUG oslo_concurrency.lockutils [req-f3f3cb46-4492-4ec8-a3e9-77e64dd3c90c req-ec017432-9bbd-471b-9cca-71dcd9e5f3a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:13:36 compute-0 nova_compute[117514]: 2025-10-08 19:13:36.397 2 DEBUG nova.network.neutron [req-f3f3cb46-4492-4ec8-a3e9-77e64dd3c90c req-ec017432-9bbd-471b-9cca-71dcd9e5f3a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Refreshing network info cache for port b5212a27-711c-427f-af17-227f961acc42 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:13:37 compute-0 nova_compute[117514]: 2025-10-08 19:13:37.283 2 DEBUG nova.network.neutron [req-f3f3cb46-4492-4ec8-a3e9-77e64dd3c90c req-ec017432-9bbd-471b-9cca-71dcd9e5f3a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Updated VIF entry in instance network info cache for port b5212a27-711c-427f-af17-227f961acc42. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:13:37 compute-0 nova_compute[117514]: 2025-10-08 19:13:37.284 2 DEBUG nova.network.neutron [req-f3f3cb46-4492-4ec8-a3e9-77e64dd3c90c req-ec017432-9bbd-471b-9cca-71dcd9e5f3a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Updating instance_info_cache with network_info: [{"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:13:37 compute-0 nova_compute[117514]: 2025-10-08 19:13:37.312 2 DEBUG oslo_concurrency.lockutils [req-f3f3cb46-4492-4ec8-a3e9-77e64dd3c90c req-ec017432-9bbd-471b-9cca-71dcd9e5f3a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:13:38 compute-0 podman[149668]: 2025-10-08 19:13:38.653602867 +0000 UTC m=+0.060019143 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent)
Oct 08 19:13:38 compute-0 podman[149666]: 2025-10-08 19:13:38.685039514 +0000 UTC m=+0.098867733 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:13:38 compute-0 podman[149667]: 2025-10-08 19:13:38.717916822 +0000 UTC m=+0.130565327 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 08 19:13:41 compute-0 nova_compute[117514]: 2025-10-08 19:13:41.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:41 compute-0 nova_compute[117514]: 2025-10-08 19:13:41.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:44.233 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:13:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:44.234 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:13:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:44.235 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:13:46 compute-0 nova_compute[117514]: 2025-10-08 19:13:46.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:46 compute-0 nova_compute[117514]: 2025-10-08 19:13:46.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:47 compute-0 ovn_controller[19759]: 2025-10-08T19:13:47Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:82:d1:87 10.100.0.7
Oct 08 19:13:47 compute-0 ovn_controller[19759]: 2025-10-08T19:13:47Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:d1:87 10.100.0.7
Oct 08 19:13:48 compute-0 podman[149745]: 2025-10-08 19:13:48.656670661 +0000 UTC m=+0.065932322 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 08 19:13:51 compute-0 nova_compute[117514]: 2025-10-08 19:13:51.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:51 compute-0 nova_compute[117514]: 2025-10-08 19:13:51.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:54 compute-0 nova_compute[117514]: 2025-10-08 19:13:54.888 2 INFO nova.compute.manager [None req-aad24999-986b-479b-b17d-e0ea21edde11 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Get console output
Oct 08 19:13:54 compute-0 nova_compute[117514]: 2025-10-08 19:13:54.894 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct 08 19:13:56 compute-0 nova_compute[117514]: 2025-10-08 19:13:56.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:56 compute-0 nova_compute[117514]: 2025-10-08 19:13:56.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:13:56 compute-0 ovn_controller[19759]: 2025-10-08T19:13:56Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:d1:87 10.100.0.7
Oct 08 19:13:57 compute-0 ovn_controller[19759]: 2025-10-08T19:13:57Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:d1:87 10.100.0.7
Oct 08 19:13:58 compute-0 podman[149769]: 2025-10-08 19:13:58.666171822 +0000 UTC m=+0.080726380 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.349 2 DEBUG nova.compute.manager [req-cdbb648e-890e-421b-9045-dc67e216ff53 req-d556c957-4c99-4c3d-a99f-5e8773a72029 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received event network-changed-b5212a27-711c-427f-af17-227f961acc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.349 2 DEBUG nova.compute.manager [req-cdbb648e-890e-421b-9045-dc67e216ff53 req-d556c957-4c99-4c3d-a99f-5e8773a72029 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Refreshing instance network info cache due to event network-changed-b5212a27-711c-427f-af17-227f961acc42. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.349 2 DEBUG oslo_concurrency.lockutils [req-cdbb648e-890e-421b-9045-dc67e216ff53 req-d556c957-4c99-4c3d-a99f-5e8773a72029 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.350 2 DEBUG oslo_concurrency.lockutils [req-cdbb648e-890e-421b-9045-dc67e216ff53 req-d556c957-4c99-4c3d-a99f-5e8773a72029 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.350 2 DEBUG nova.network.neutron [req-cdbb648e-890e-421b-9045-dc67e216ff53 req-d556c957-4c99-4c3d-a99f-5e8773a72029 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Refreshing network info cache for port b5212a27-711c-427f-af17-227f961acc42 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.433 2 DEBUG oslo_concurrency.lockutils [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "f6235bae-08b8-41c2-a187-92e12703dc49" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.434 2 DEBUG oslo_concurrency.lockutils [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.434 2 DEBUG oslo_concurrency.lockutils [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.434 2 DEBUG oslo_concurrency.lockutils [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.434 2 DEBUG oslo_concurrency.lockutils [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.435 2 INFO nova.compute.manager [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Terminating instance
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.436 2 DEBUG nova.compute.manager [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 08 19:14:00 compute-0 kernel: tapb5212a27-71 (unregistering): left promiscuous mode
Oct 08 19:14:00 compute-0 NetworkManager[1035]: <info>  [1759950840.4686] device (tapb5212a27-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 19:14:00 compute-0 ovn_controller[19759]: 2025-10-08T19:14:00Z|00134|binding|INFO|Releasing lport b5212a27-711c-427f-af17-227f961acc42 from this chassis (sb_readonly=0)
Oct 08 19:14:00 compute-0 ovn_controller[19759]: 2025-10-08T19:14:00Z|00135|binding|INFO|Setting lport b5212a27-711c-427f-af17-227f961acc42 down in Southbound
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:00 compute-0 ovn_controller[19759]: 2025-10-08T19:14:00Z|00136|binding|INFO|Removing iface tapb5212a27-71 ovn-installed in OVS
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.493 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:d1:87 10.100.0.7'], port_security=['fa:16:3e:82:d1:87 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f6235bae-08b8-41c2-a187-92e12703dc49', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-316ecc22-916e-4a30-bb08-c6bd94993bb1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2739dafe-af3c-4b39-8e6a-f28bb373aed0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12b6fae3-2fc2-423c-bbb0-b3805950f5b3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=b5212a27-711c-427f-af17-227f961acc42) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.495 28643 INFO neutron.agent.ovn.metadata.agent [-] Port b5212a27-711c-427f-af17-227f961acc42 in datapath 316ecc22-916e-4a30-bb08-c6bd94993bb1 unbound from our chassis
Oct 08 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.497 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 316ecc22-916e-4a30-bb08-c6bd94993bb1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 08 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.499 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee279e4-535c-4ba3-bd05-a367027b5b34]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.500 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1 namespace which is not needed anymore
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:00 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Oct 08 19:14:00 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Consumed 14.019s CPU time.
Oct 08 19:14:00 compute-0 systemd-machined[77568]: Machine qemu-10-instance-0000000a terminated.
Oct 08 19:14:00 compute-0 kernel: tapb5212a27-71: entered promiscuous mode
Oct 08 19:14:00 compute-0 kernel: tapb5212a27-71 (unregistering): left promiscuous mode
Oct 08 19:14:00 compute-0 NetworkManager[1035]: <info>  [1759950840.6705] manager: (tapb5212a27-71): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:00 compute-0 neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1[149610]: [NOTICE]   (149628) : haproxy version is 2.8.14-c23fe91
Oct 08 19:14:00 compute-0 neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1[149610]: [NOTICE]   (149628) : path to executable is /usr/sbin/haproxy
Oct 08 19:14:00 compute-0 neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1[149610]: [WARNING]  (149628) : Exiting Master process...
Oct 08 19:14:00 compute-0 neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1[149610]: [ALERT]    (149628) : Current worker (149630) exited with code 143 (Terminated)
Oct 08 19:14:00 compute-0 neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1[149610]: [WARNING]  (149628) : All workers exited. Exiting... (0)
Oct 08 19:14:00 compute-0 systemd[1]: libpod-194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf.scope: Deactivated successfully.
Oct 08 19:14:00 compute-0 podman[149813]: 2025-10-08 19:14:00.720128451 +0000 UTC m=+0.098053962 container died 194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.737 2 INFO nova.virt.libvirt.driver [-] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Instance destroyed successfully.
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.738 2 DEBUG nova.objects.instance [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid f6235bae-08b8-41c2-a187-92e12703dc49 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:14:00 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf-userdata-shm.mount: Deactivated successfully.
Oct 08 19:14:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-3f612388ad5483d09f475d7eb14eac62998c672b7cb73df4597040f3e56b0e25-merged.mount: Deactivated successfully.
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.759 2 DEBUG nova.virt.libvirt.vif [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:13:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-240293585',display_name='tempest-TestNetworkBasicOps-server-240293585',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-240293585',id=10,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB/EJhQ2cVwT1bBhqwqz8VJCILEiuVe01OpwaJWr7LJzSA4TSCURQ/KKnNYCEn/1h4DXNQh6VFPnJP6UNtvndekIhyamyZMFdOa7ELSKKJb75n9Ge1ikETCgbfRbvFVTqw==',key_name='tempest-TestNetworkBasicOps-1252364033',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:13:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-3qhae7ry',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:13:33Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=f6235bae-08b8-41c2-a187-92e12703dc49,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.759 2 DEBUG nova.network.os_vif_util [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.761 2 DEBUG nova.network.os_vif_util [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:d1:87,bridge_name='br-int',has_traffic_filtering=True,id=b5212a27-711c-427f-af17-227f961acc42,network=Network(316ecc22-916e-4a30-bb08-c6bd94993bb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5212a27-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.761 2 DEBUG os_vif [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:d1:87,bridge_name='br-int',has_traffic_filtering=True,id=b5212a27-711c-427f-af17-227f961acc42,network=Network(316ecc22-916e-4a30-bb08-c6bd94993bb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5212a27-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.764 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5212a27-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:14:00 compute-0 podman[149813]: 2025-10-08 19:14:00.765308364 +0000 UTC m=+0.143233875 container cleanup 194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:14:00 compute-0 systemd[1]: libpod-conmon-194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf.scope: Deactivated successfully.
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.775 2 INFO os_vif [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:d1:87,bridge_name='br-int',has_traffic_filtering=True,id=b5212a27-711c-427f-af17-227f961acc42,network=Network(316ecc22-916e-4a30-bb08-c6bd94993bb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5212a27-71')
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.776 2 INFO nova.virt.libvirt.driver [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Deleting instance files /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49_del
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.777 2 INFO nova.virt.libvirt.driver [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Deletion of /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49_del complete
Oct 08 19:14:00 compute-0 podman[149863]: 2025-10-08 19:14:00.828031874 +0000 UTC m=+0.040795948 container remove 194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 08 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.835 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1ded8c70-223a-42a4-a26c-2bc645880d67]: (4, ('Wed Oct  8 07:14:00 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1 (194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf)\n194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf\nWed Oct  8 07:14:00 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1 (194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf)\n194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.837 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f9102b90-651e-43ac-8779-63df50d9f640]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.841 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap316ecc22-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:14:00 compute-0 kernel: tap316ecc22-90: left promiscuous mode
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.845 2 INFO nova.compute.manager [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Took 0.41 seconds to destroy the instance on the hypervisor.
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.846 2 DEBUG oslo.service.loopingcall [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.846 2 DEBUG nova.compute.manager [-] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.847 2 DEBUG nova.network.neutron [-] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.870 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[5ca7dfcd-5700-4a1b-b443-b90f994cde7b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.897 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[0af635a4-838d-4667-9a68-d07228dc2f8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.900 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[47ebcc4f-e523-40bb-aec7-b38d78222353]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.923 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[d34d4e8d-e83c-4121-af93-37012dbac095]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 148262, 'reachable_time': 38978, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 149878, 'error': None, 'target': 'ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:00 compute-0 systemd[1]: run-netns-ovnmeta\x2d316ecc22\x2d916e\x2d4a30\x2dbb08\x2dc6bd94993bb1.mount: Deactivated successfully.
Oct 08 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.925 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 08 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.926 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[334101b8-2385-46ba-a202-cd32cacbaa65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:01 compute-0 nova_compute[117514]: 2025-10-08 19:14:01.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:02 compute-0 nova_compute[117514]: 2025-10-08 19:14:02.532 2 DEBUG nova.compute.manager [req-934b2f61-9e22-422d-950f-1d2870467692 req-ceaf5385-a2c1-4505-ab7b-1798a0a6ade2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received event network-vif-unplugged-b5212a27-711c-427f-af17-227f961acc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:14:02 compute-0 nova_compute[117514]: 2025-10-08 19:14:02.532 2 DEBUG oslo_concurrency.lockutils [req-934b2f61-9e22-422d-950f-1d2870467692 req-ceaf5385-a2c1-4505-ab7b-1798a0a6ade2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:02 compute-0 nova_compute[117514]: 2025-10-08 19:14:02.533 2 DEBUG oslo_concurrency.lockutils [req-934b2f61-9e22-422d-950f-1d2870467692 req-ceaf5385-a2c1-4505-ab7b-1798a0a6ade2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:02 compute-0 nova_compute[117514]: 2025-10-08 19:14:02.533 2 DEBUG oslo_concurrency.lockutils [req-934b2f61-9e22-422d-950f-1d2870467692 req-ceaf5385-a2c1-4505-ab7b-1798a0a6ade2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:02 compute-0 nova_compute[117514]: 2025-10-08 19:14:02.533 2 DEBUG nova.compute.manager [req-934b2f61-9e22-422d-950f-1d2870467692 req-ceaf5385-a2c1-4505-ab7b-1798a0a6ade2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] No waiting events found dispatching network-vif-unplugged-b5212a27-711c-427f-af17-227f961acc42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:14:02 compute-0 nova_compute[117514]: 2025-10-08 19:14:02.534 2 DEBUG nova.compute.manager [req-934b2f61-9e22-422d-950f-1d2870467692 req-ceaf5385-a2c1-4505-ab7b-1798a0a6ade2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received event network-vif-unplugged-b5212a27-711c-427f-af17-227f961acc42 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 08 19:14:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:03.181 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:14:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:03.182 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 08 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.386 2 DEBUG nova.network.neutron [-] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.408 2 INFO nova.compute.manager [-] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Took 2.56 seconds to deallocate network for instance.
Oct 08 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.474 2 DEBUG oslo_concurrency.lockutils [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.475 2 DEBUG oslo_concurrency.lockutils [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.541 2 DEBUG nova.compute.provider_tree [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.559 2 DEBUG nova.scheduler.client.report [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.580 2 DEBUG oslo_concurrency.lockutils [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.609 2 INFO nova.scheduler.client.report [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance f6235bae-08b8-41c2-a187-92e12703dc49
Oct 08 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.647 2 DEBUG nova.network.neutron [req-cdbb648e-890e-421b-9045-dc67e216ff53 req-d556c957-4c99-4c3d-a99f-5e8773a72029 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Updated VIF entry in instance network info cache for port b5212a27-711c-427f-af17-227f961acc42. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.648 2 DEBUG nova.network.neutron [req-cdbb648e-890e-421b-9045-dc67e216ff53 req-d556c957-4c99-4c3d-a99f-5e8773a72029 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Updating instance_info_cache with network_info: [{"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.669 2 DEBUG oslo_concurrency.lockutils [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.670 2 DEBUG oslo_concurrency.lockutils [req-cdbb648e-890e-421b-9045-dc67e216ff53 req-d556c957-4c99-4c3d-a99f-5e8773a72029 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.611 2 DEBUG nova.compute.manager [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received event network-vif-plugged-b5212a27-711c-427f-af17-227f961acc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.612 2 DEBUG oslo_concurrency.lockutils [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.612 2 DEBUG oslo_concurrency.lockutils [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.612 2 DEBUG oslo_concurrency.lockutils [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.612 2 DEBUG nova.compute.manager [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] No waiting events found dispatching network-vif-plugged-b5212a27-711c-427f-af17-227f961acc42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.613 2 WARNING nova.compute.manager [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received unexpected event network-vif-plugged-b5212a27-711c-427f-af17-227f961acc42 for instance with vm_state deleted and task_state None.
Oct 08 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.613 2 DEBUG nova.compute.manager [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received event network-vif-deleted-b5212a27-711c-427f-af17-227f961acc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.613 2 INFO nova.compute.manager [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Neutron deleted interface b5212a27-711c-427f-af17-227f961acc42; detaching it from the instance and deleting it from the info cache
Oct 08 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.613 2 DEBUG nova.network.neutron [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Oct 08 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.616 2 DEBUG nova.compute.manager [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Detach interface failed, port_id=b5212a27-711c-427f-af17-227f961acc42, reason: Instance f6235bae-08b8-41c2-a187-92e12703dc49 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Oct 08 19:14:04 compute-0 podman[149879]: 2025-10-08 19:14:04.695744999 +0000 UTC m=+0.098188764 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc.)
Oct 08 19:14:04 compute-0 podman[149880]: 2025-10-08 19:14:04.6985504 +0000 UTC m=+0.095444705 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Oct 08 19:14:04 compute-0 podman[149918]: 2025-10-08 19:14:04.800208702 +0000 UTC m=+0.070613678 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:14:05 compute-0 nova_compute[117514]: 2025-10-08 19:14:05.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:14:05 compute-0 nova_compute[117514]: 2025-10-08 19:14:05.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:06 compute-0 nova_compute[117514]: 2025-10-08 19:14:06.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:07 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:07.185 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:14:07 compute-0 nova_compute[117514]: 2025-10-08 19:14:07.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:07 compute-0 nova_compute[117514]: 2025-10-08 19:14:07.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:14:08 compute-0 nova_compute[117514]: 2025-10-08 19:14:08.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:14:08 compute-0 nova_compute[117514]: 2025-10-08 19:14:08.741 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:08 compute-0 nova_compute[117514]: 2025-10-08 19:14:08.741 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:08 compute-0 nova_compute[117514]: 2025-10-08 19:14:08.742 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:08 compute-0 nova_compute[117514]: 2025-10-08 19:14:08.742 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:14:08 compute-0 podman[149944]: 2025-10-08 19:14:08.862110411 +0000 UTC m=+0.070140104 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.license=GPLv2)
Oct 08 19:14:08 compute-0 podman[149946]: 2025-10-08 19:14:08.868454514 +0000 UTC m=+0.065197592 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 08 19:14:08 compute-0 podman[149945]: 2025-10-08 19:14:08.95498497 +0000 UTC m=+0.145019224 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:14:08 compute-0 nova_compute[117514]: 2025-10-08 19:14:08.959 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:14:08 compute-0 nova_compute[117514]: 2025-10-08 19:14:08.961 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6099MB free_disk=73.4137954711914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:14:08 compute-0 nova_compute[117514]: 2025-10-08 19:14:08.961 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:08 compute-0 nova_compute[117514]: 2025-10-08 19:14:08.962 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:09 compute-0 nova_compute[117514]: 2025-10-08 19:14:09.034 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:14:09 compute-0 nova_compute[117514]: 2025-10-08 19:14:09.034 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:14:09 compute-0 nova_compute[117514]: 2025-10-08 19:14:09.059 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:14:09 compute-0 nova_compute[117514]: 2025-10-08 19:14:09.079 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:14:09 compute-0 nova_compute[117514]: 2025-10-08 19:14:09.109 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:14:09 compute-0 nova_compute[117514]: 2025-10-08 19:14:09.109 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:10 compute-0 nova_compute[117514]: 2025-10-08 19:14:10.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:11 compute-0 nova_compute[117514]: 2025-10-08 19:14:11.104 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:14:11 compute-0 nova_compute[117514]: 2025-10-08 19:14:11.104 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:14:11 compute-0 nova_compute[117514]: 2025-10-08 19:14:11.105 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:14:11 compute-0 nova_compute[117514]: 2025-10-08 19:14:11.105 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:14:11 compute-0 nova_compute[117514]: 2025-10-08 19:14:11.122 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 08 19:14:11 compute-0 nova_compute[117514]: 2025-10-08 19:14:11.122 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:14:11 compute-0 nova_compute[117514]: 2025-10-08 19:14:11.123 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:14:11 compute-0 nova_compute[117514]: 2025-10-08 19:14:11.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:11 compute-0 nova_compute[117514]: 2025-10-08 19:14:11.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:14:12 compute-0 nova_compute[117514]: 2025-10-08 19:14:12.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:14:14 compute-0 nova_compute[117514]: 2025-10-08 19:14:14.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:14:15 compute-0 nova_compute[117514]: 2025-10-08 19:14:15.719 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:14:15 compute-0 nova_compute[117514]: 2025-10-08 19:14:15.735 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950840.7344065, f6235bae-08b8-41c2-a187-92e12703dc49 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:14:15 compute-0 nova_compute[117514]: 2025-10-08 19:14:15.736 2 INFO nova.compute.manager [-] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] VM Stopped (Lifecycle Event)
Oct 08 19:14:15 compute-0 nova_compute[117514]: 2025-10-08 19:14:15.757 2 DEBUG nova.compute.manager [None req-529c4b94-2475-4d8f-afb3-35098bdbf50e - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:14:15 compute-0 nova_compute[117514]: 2025-10-08 19:14:15.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:16 compute-0 nova_compute[117514]: 2025-10-08 19:14:16.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:19 compute-0 podman[150006]: 2025-10-08 19:14:19.658155254 +0000 UTC m=+0.069623760 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 08 19:14:20 compute-0 nova_compute[117514]: 2025-10-08 19:14:20.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:21 compute-0 nova_compute[117514]: 2025-10-08 19:14:21.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.495 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.496 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.515 2 DEBUG nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.607 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.608 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.621 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.621 2 INFO nova.compute.claims [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Claim successful on node compute-0.ctlplane.example.com
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.733 2 DEBUG nova.compute.provider_tree [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.747 2 DEBUG nova.scheduler.client.report [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.767 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.768 2 DEBUG nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.811 2 DEBUG nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.811 2 DEBUG nova.network.neutron [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.832 2 INFO nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.849 2 DEBUG nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.943 2 DEBUG nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.946 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.947 2 INFO nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Creating image(s)
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.948 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.949 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.950 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.975 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.037 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.039 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.040 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.066 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.128 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.129 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.172 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.173 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.174 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.214 2 DEBUG nova.policy [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.236 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.237 2 DEBUG nova.virt.disk.api [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.238 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.300 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.301 2 DEBUG nova.virt.disk.api [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.302 2 DEBUG nova.objects.instance [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 5e004931-f1db-408c-9f7a-6c6c50c5f8ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.317 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.317 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Ensure instance console log exists: /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.318 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.319 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.319 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.830 2 DEBUG nova.network.neutron [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Successfully created port: ae9e7968-10b0-4606-9fa3-c91374cf1cc1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 08 19:14:27 compute-0 nova_compute[117514]: 2025-10-08 19:14:27.699 2 DEBUG nova.network.neutron [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Successfully updated port: ae9e7968-10b0-4606-9fa3-c91374cf1cc1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 08 19:14:27 compute-0 nova_compute[117514]: 2025-10-08 19:14:27.714 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:14:27 compute-0 nova_compute[117514]: 2025-10-08 19:14:27.715 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:14:27 compute-0 nova_compute[117514]: 2025-10-08 19:14:27.716 2 DEBUG nova.network.neutron [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 08 19:14:27 compute-0 nova_compute[117514]: 2025-10-08 19:14:27.786 2 DEBUG nova.compute.manager [req-36472180-8aae-481f-99e9-324f3cea6893 req-31a141d6-f2b5-494e-b7e4-e116f2a4eb4d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:14:27 compute-0 nova_compute[117514]: 2025-10-08 19:14:27.787 2 DEBUG nova.compute.manager [req-36472180-8aae-481f-99e9-324f3cea6893 req-31a141d6-f2b5-494e-b7e4-e116f2a4eb4d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing instance network info cache due to event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:14:27 compute-0 nova_compute[117514]: 2025-10-08 19:14:27.787 2 DEBUG oslo_concurrency.lockutils [req-36472180-8aae-481f-99e9-324f3cea6893 req-31a141d6-f2b5-494e-b7e4-e116f2a4eb4d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:14:27 compute-0 nova_compute[117514]: 2025-10-08 19:14:27.861 2 DEBUG nova.network.neutron [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.640 2 DEBUG nova.network.neutron [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updating instance_info_cache with network_info: [{"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.658 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.659 2 DEBUG nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Instance network_info: |[{"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.660 2 DEBUG oslo_concurrency.lockutils [req-36472180-8aae-481f-99e9-324f3cea6893 req-31a141d6-f2b5-494e-b7e4-e116f2a4eb4d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.660 2 DEBUG nova.network.neutron [req-36472180-8aae-481f-99e9-324f3cea6893 req-31a141d6-f2b5-494e-b7e4-e116f2a4eb4d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.664 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Start _get_guest_xml network_info=[{"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.671 2 WARNING nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.679 2 DEBUG nova.virt.libvirt.host [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.680 2 DEBUG nova.virt.libvirt.host [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.684 2 DEBUG nova.virt.libvirt.host [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.685 2 DEBUG nova.virt.libvirt.host [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.686 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.686 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.686 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.687 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.687 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.687 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.688 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.688 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.688 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.688 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.688 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.689 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.692 2 DEBUG nova.virt.libvirt.vif [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:14:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-103133275',display_name='tempest-TestNetworkBasicOps-server-103133275',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-103133275',id=11,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI5y6d80fHySET4pCbLeqyj0cyDTZn6hTOGziG7pCiD92qFDw7Uq+y0suIKpGvDK2QOm6VBv2vJI5Io6WjjxteICCSlzmOgxu+CdOrYx2YA1B+bI4ndO5c+cp00qcb4ncw==',key_name='tempest-TestNetworkBasicOps-286586540',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-cyi34c6v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:14:25Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=5e004931-f1db-408c-9f7a-6c6c50c5f8ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.693 2 DEBUG nova.network.os_vif_util [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.694 2 DEBUG nova.network.os_vif_util [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:50:87,bridge_name='br-int',has_traffic_filtering=True,id=ae9e7968-10b0-4606-9fa3-c91374cf1cc1,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae9e7968-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.695 2 DEBUG nova.objects.instance [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5e004931-f1db-408c-9f7a-6c6c50c5f8ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.711 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] End _get_guest_xml xml=<domain type="kvm">
Oct 08 19:14:28 compute-0 nova_compute[117514]:   <uuid>5e004931-f1db-408c-9f7a-6c6c50c5f8ef</uuid>
Oct 08 19:14:28 compute-0 nova_compute[117514]:   <name>instance-0000000b</name>
Oct 08 19:14:28 compute-0 nova_compute[117514]:   <memory>131072</memory>
Oct 08 19:14:28 compute-0 nova_compute[117514]:   <vcpu>1</vcpu>
Oct 08 19:14:28 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <nova:name>tempest-TestNetworkBasicOps-server-103133275</nova:name>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <nova:creationTime>2025-10-08 19:14:28</nova:creationTime>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <nova:flavor name="m1.nano">
Oct 08 19:14:28 compute-0 nova_compute[117514]:         <nova:memory>128</nova:memory>
Oct 08 19:14:28 compute-0 nova_compute[117514]:         <nova:disk>1</nova:disk>
Oct 08 19:14:28 compute-0 nova_compute[117514]:         <nova:swap>0</nova:swap>
Oct 08 19:14:28 compute-0 nova_compute[117514]:         <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:14:28 compute-0 nova_compute[117514]:         <nova:vcpus>1</nova:vcpus>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       </nova:flavor>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <nova:owner>
Oct 08 19:14:28 compute-0 nova_compute[117514]:         <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:14:28 compute-0 nova_compute[117514]:         <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       </nova:owner>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <nova:ports>
Oct 08 19:14:28 compute-0 nova_compute[117514]:         <nova:port uuid="ae9e7968-10b0-4606-9fa3-c91374cf1cc1">
Oct 08 19:14:28 compute-0 nova_compute[117514]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:         </nova:port>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       </nova:ports>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     </nova:instance>
Oct 08 19:14:28 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:14:28 compute-0 nova_compute[117514]:   <sysinfo type="smbios">
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <system>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <entry name="manufacturer">RDO</entry>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <entry name="product">OpenStack Compute</entry>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <entry name="serial">5e004931-f1db-408c-9f7a-6c6c50c5f8ef</entry>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <entry name="uuid">5e004931-f1db-408c-9f7a-6c6c50c5f8ef</entry>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <entry name="family">Virtual Machine</entry>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     </system>
Oct 08 19:14:28 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:14:28 compute-0 nova_compute[117514]:   <os>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <boot dev="hd"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <smbios mode="sysinfo"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:   </os>
Oct 08 19:14:28 compute-0 nova_compute[117514]:   <features>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <vmcoreinfo/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:   </features>
Oct 08 19:14:28 compute-0 nova_compute[117514]:   <clock offset="utc">
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <timer name="pit" tickpolicy="delay"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <timer name="hpet" present="no"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:14:28 compute-0 nova_compute[117514]:   <cpu mode="host-model" match="exact">
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <topology sockets="1" cores="1" threads="1"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:14:28 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <disk type="file" device="disk">
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <target dev="vda" bus="virtio"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <disk type="file" device="cdrom">
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <driver name="qemu" type="raw" cache="none"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk.config"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <target dev="sda" bus="sata"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <interface type="ethernet">
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <mac address="fa:16:3e:23:50:87"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <mtu size="1442"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <target dev="tapae9e7968-10"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <serial type="pty">
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <log file="/var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/console.log" append="off"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <video>
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     </video>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <input type="tablet" bus="usb"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <rng model="virtio">
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <backend model="random">/dev/urandom</backend>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <controller type="usb" index="0"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     <memballoon model="virtio">
Oct 08 19:14:28 compute-0 nova_compute[117514]:       <stats period="10"/>
Oct 08 19:14:28 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:14:28 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:14:28 compute-0 nova_compute[117514]: </domain>
Oct 08 19:14:28 compute-0 nova_compute[117514]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.712 2 DEBUG nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Preparing to wait for external event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.713 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.714 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.714 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.716 2 DEBUG nova.virt.libvirt.vif [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:14:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-103133275',display_name='tempest-TestNetworkBasicOps-server-103133275',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-103133275',id=11,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI5y6d80fHySET4pCbLeqyj0cyDTZn6hTOGziG7pCiD92qFDw7Uq+y0suIKpGvDK2QOm6VBv2vJI5Io6WjjxteICCSlzmOgxu+CdOrYx2YA1B+bI4ndO5c+cp00qcb4ncw==',key_name='tempest-TestNetworkBasicOps-286586540',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-cyi34c6v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:14:25Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=5e004931-f1db-408c-9f7a-6c6c50c5f8ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.717 2 DEBUG nova.network.os_vif_util [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.718 2 DEBUG nova.network.os_vif_util [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:50:87,bridge_name='br-int',has_traffic_filtering=True,id=ae9e7968-10b0-4606-9fa3-c91374cf1cc1,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae9e7968-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.719 2 DEBUG os_vif [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:50:87,bridge_name='br-int',has_traffic_filtering=True,id=ae9e7968-10b0-4606-9fa3-c91374cf1cc1,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae9e7968-10') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.721 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.722 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.728 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapae9e7968-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.729 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapae9e7968-10, col_values=(('external_ids', {'iface-id': 'ae9e7968-10b0-4606-9fa3-c91374cf1cc1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:23:50:87', 'vm-uuid': '5e004931-f1db-408c-9f7a-6c6c50c5f8ef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:28 compute-0 NetworkManager[1035]: <info>  [1759950868.7333] manager: (tapae9e7968-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.740 2 INFO os_vif [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:50:87,bridge_name='br-int',has_traffic_filtering=True,id=ae9e7968-10b0-4606-9fa3-c91374cf1cc1,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae9e7968-10')
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.784 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.784 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.785 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:23:50:87, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 08 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.785 2 INFO nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Using config drive
Oct 08 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.197 2 INFO nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Creating config drive at /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk.config
Oct 08 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.206 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqpekn68q execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.347 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqpekn68q" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:14:29 compute-0 kernel: tapae9e7968-10: entered promiscuous mode
Oct 08 19:14:29 compute-0 NetworkManager[1035]: <info>  [1759950869.4416] manager: (tapae9e7968-10): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Oct 08 19:14:29 compute-0 ovn_controller[19759]: 2025-10-08T19:14:29Z|00137|binding|INFO|Claiming lport ae9e7968-10b0-4606-9fa3-c91374cf1cc1 for this chassis.
Oct 08 19:14:29 compute-0 ovn_controller[19759]: 2025-10-08T19:14:29Z|00138|binding|INFO|ae9e7968-10b0-4606-9fa3-c91374cf1cc1: Claiming fa:16:3e:23:50:87 10.100.0.4
Oct 08 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.496 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:50:87 10.100.0.4'], port_security=['fa:16:3e:23:50:87 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '5e004931-f1db-408c-9f7a-6c6c50c5f8ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c3b607ea-9253-4328-bb00-668338c7a25d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=770536b4-68ae-4751-9b56-96d89b6bc561, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=ae9e7968-10b0-4606-9fa3-c91374cf1cc1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.497 28643 INFO neutron.agent.ovn.metadata.agent [-] Port ae9e7968-10b0-4606-9fa3-c91374cf1cc1 in datapath 6826b0cb-7eaf-4468-bf17-e3c581bfc4ac bound to our chassis
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.498 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6826b0cb-7eaf-4468-bf17-e3c581bfc4ac
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.512 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[dab5c0f1-1002-4339-939b-ddca8bb83dca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.513 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6826b0cb-71 in ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.515 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6826b0cb-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.515 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[14f75afb-2221-4e58-a3db-8ab059fe2470]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.516 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[da5f1852-8d17-401e-b67c-00ac3a9911f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:29 compute-0 systemd-udevd[150082]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:14:29 compute-0 systemd-machined[77568]: New machine qemu-11-instance-0000000b.
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.542 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[09b5fc4d-b421-4ccc-9c0b-a9e3840240db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:29 compute-0 NetworkManager[1035]: <info>  [1759950869.5450] device (tapae9e7968-10): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 08 19:14:29 compute-0 NetworkManager[1035]: <info>  [1759950869.5460] device (tapae9e7968-10): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 08 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:29 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000b.
Oct 08 19:14:29 compute-0 ovn_controller[19759]: 2025-10-08T19:14:29Z|00139|binding|INFO|Setting lport ae9e7968-10b0-4606-9fa3-c91374cf1cc1 ovn-installed in OVS
Oct 08 19:14:29 compute-0 ovn_controller[19759]: 2025-10-08T19:14:29Z|00140|binding|INFO|Setting lport ae9e7968-10b0-4606-9fa3-c91374cf1cc1 up in Southbound
Oct 08 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.574 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[7ee601ff-344e-4454-9513-b931fac9ae4e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:29 compute-0 podman[150058]: 2025-10-08 19:14:29.580693445 +0000 UTC m=+0.131259628 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.605 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b340e8-a958-4497-b61d-af2518421ff2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.609 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[54f2a5a3-3411-408d-a72b-bf5ee7f3a517]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:29 compute-0 systemd-udevd[150086]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:14:29 compute-0 NetworkManager[1035]: <info>  [1759950869.6108] manager: (tap6826b0cb-70): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.638 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[d28d90df-e434-4487-9d07-4cc62bb62fe1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.645 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[e55ccb9c-8f81-4613-a552-f83ce4ce85fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:29 compute-0 NetworkManager[1035]: <info>  [1759950869.6663] device (tap6826b0cb-70): carrier: link connected
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.671 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[149c8ca2-4f37-4dfd-a781-1c449097c71b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.686 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[adec9253-0955-4665-a55d-1a31fde7862a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6826b0cb-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:04:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 154024, 'reachable_time': 41558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 150117, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.700 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[5485fb2a-5f24-4372-98c0-96b3f85d91bd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:42d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 154024, 'tstamp': 154024}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 150118, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.721 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[9ca37da0-3434-470b-b8cc-633f448d0cb5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6826b0cb-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:04:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 154024, 'reachable_time': 41558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 150119, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.769 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[aabd2cbe-15b0-4e12-ad84-36057171c925]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.859 2 DEBUG nova.network.neutron [req-36472180-8aae-481f-99e9-324f3cea6893 req-31a141d6-f2b5-494e-b7e4-e116f2a4eb4d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updated VIF entry in instance network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.860 2 DEBUG nova.network.neutron [req-36472180-8aae-481f-99e9-324f3cea6893 req-31a141d6-f2b5-494e-b7e4-e116f2a4eb4d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updating instance_info_cache with network_info: [{"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.861 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[327bf00e-8458-4923-be66-682d3a7dd979]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.864 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6826b0cb-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.864 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.865 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6826b0cb-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:14:29 compute-0 kernel: tap6826b0cb-70: entered promiscuous mode
Oct 08 19:14:29 compute-0 NetworkManager[1035]: <info>  [1759950869.8679] manager: (tap6826b0cb-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Oct 08 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.870 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6826b0cb-70, col_values=(('external_ids', {'iface-id': 'eabc4672-d176-4f11-b5f6-bcbea840c3e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.872 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6826b0cb-7eaf-4468-bf17-e3c581bfc4ac.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6826b0cb-7eaf-4468-bf17-e3c581bfc4ac.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 08 19:14:29 compute-0 ovn_controller[19759]: 2025-10-08T19:14:29Z|00141|binding|INFO|Releasing lport eabc4672-d176-4f11-b5f6-bcbea840c3e8 from this chassis (sb_readonly=0)
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.875 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[94fe8f8e-8ba8-44b3-a65c-9969f3f13ab2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.876 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: global
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     log         /dev/log local0 debug
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     log-tag     haproxy-metadata-proxy-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     user        root
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     group       root
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     maxconn     1024
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     pidfile     /var/lib/neutron/external/pids/6826b0cb-7eaf-4468-bf17-e3c581bfc4ac.pid.haproxy
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     daemon
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: defaults
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     log global
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     mode http
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     option httplog
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     option dontlognull
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     option http-server-close
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     option forwardfor
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     retries                 3
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     timeout http-request    30s
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     timeout connect         30s
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     timeout client          32s
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     timeout server          32s
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     timeout http-keep-alive 30s
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: listen listener
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     bind 169.254.169.254:80
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     server metadata /var/lib/neutron/metadata_proxy
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:     http-request add-header X-OVN-Network-ID 6826b0cb-7eaf-4468-bf17-e3c581bfc4ac
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 08 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.878 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'env', 'PROCESS_TAG=haproxy-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6826b0cb-7eaf-4468-bf17-e3c581bfc4ac.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 08 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.893 2 DEBUG nova.compute.manager [req-6a1a61a3-502c-4ac5-bcc4-e648c7210557 req-ee88695f-33ad-481f-bc70-bbbc14834775 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.893 2 DEBUG oslo_concurrency.lockutils [req-6a1a61a3-502c-4ac5-bcc4-e648c7210557 req-ee88695f-33ad-481f-bc70-bbbc14834775 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.895 2 DEBUG oslo_concurrency.lockutils [req-6a1a61a3-502c-4ac5-bcc4-e648c7210557 req-ee88695f-33ad-481f-bc70-bbbc14834775 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.895 2 DEBUG oslo_concurrency.lockutils [req-6a1a61a3-502c-4ac5-bcc4-e648c7210557 req-ee88695f-33ad-481f-bc70-bbbc14834775 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.896 2 DEBUG nova.compute.manager [req-6a1a61a3-502c-4ac5-bcc4-e648c7210557 req-ee88695f-33ad-481f-bc70-bbbc14834775 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Processing event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 08 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.898 2 DEBUG oslo_concurrency.lockutils [req-36472180-8aae-481f-99e9-324f3cea6893 req-31a141d6-f2b5-494e-b7e4-e116f2a4eb4d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:14:30 compute-0 podman[150158]: 2025-10-08 19:14:30.328141819 +0000 UTC m=+0.078371382 container create fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 08 19:14:30 compute-0 systemd[1]: Started libpod-conmon-fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b.scope.
Oct 08 19:14:30 compute-0 podman[150158]: 2025-10-08 19:14:30.28763491 +0000 UTC m=+0.037864564 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 08 19:14:30 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:14:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6837e9d7ec807fa0b782d57371b150b504d3cac3a36af379f52c92234713c15/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 08 19:14:30 compute-0 podman[150158]: 2025-10-08 19:14:30.421152003 +0000 UTC m=+0.171381596 container init fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:14:30 compute-0 podman[150158]: 2025-10-08 19:14:30.431085479 +0000 UTC m=+0.181315042 container start fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001)
Oct 08 19:14:30 compute-0 neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac[150173]: [NOTICE]   (150177) : New worker (150179) forked
Oct 08 19:14:30 compute-0 neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac[150173]: [NOTICE]   (150177) : Loading success.
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.498 2 DEBUG nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.499 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950870.4977334, 5e004931-f1db-408c-9f7a-6c6c50c5f8ef => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.500 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] VM Started (Lifecycle Event)
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.504 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.508 2 INFO nova.virt.libvirt.driver [-] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Instance spawned successfully.
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.509 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.536 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.536 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.537 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.537 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.537 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.538 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.542 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.545 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.578 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.578 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950870.49909, 5e004931-f1db-408c-9f7a-6c6c50c5f8ef => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.578 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] VM Paused (Lifecycle Event)
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.608 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.611 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950870.502619, 5e004931-f1db-408c-9f7a-6c6c50c5f8ef => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.611 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] VM Resumed (Lifecycle Event)
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.615 2 INFO nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Took 4.67 seconds to spawn the instance on the hypervisor.
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.616 2 DEBUG nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.638 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.641 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.665 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.679 2 INFO nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Took 5.10 seconds to build instance.
Oct 08 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.693 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:31 compute-0 nova_compute[117514]: 2025-10-08 19:14:31.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:31 compute-0 nova_compute[117514]: 2025-10-08 19:14:31.990 2 DEBUG nova.compute.manager [req-45fa6cb2-377a-403e-98d9-b23aae8f33e8 req-345f1e34-1ccc-4ec6-b238-955c274fdfd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:14:31 compute-0 nova_compute[117514]: 2025-10-08 19:14:31.991 2 DEBUG oslo_concurrency.lockutils [req-45fa6cb2-377a-403e-98d9-b23aae8f33e8 req-345f1e34-1ccc-4ec6-b238-955c274fdfd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:31 compute-0 nova_compute[117514]: 2025-10-08 19:14:31.991 2 DEBUG oslo_concurrency.lockutils [req-45fa6cb2-377a-403e-98d9-b23aae8f33e8 req-345f1e34-1ccc-4ec6-b238-955c274fdfd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:31 compute-0 nova_compute[117514]: 2025-10-08 19:14:31.991 2 DEBUG oslo_concurrency.lockutils [req-45fa6cb2-377a-403e-98d9-b23aae8f33e8 req-345f1e34-1ccc-4ec6-b238-955c274fdfd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:31 compute-0 nova_compute[117514]: 2025-10-08 19:14:31.992 2 DEBUG nova.compute.manager [req-45fa6cb2-377a-403e-98d9-b23aae8f33e8 req-345f1e34-1ccc-4ec6-b238-955c274fdfd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] No waiting events found dispatching network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:14:31 compute-0 nova_compute[117514]: 2025-10-08 19:14:31.992 2 WARNING nova.compute.manager [req-45fa6cb2-377a-403e-98d9-b23aae8f33e8 req-345f1e34-1ccc-4ec6-b238-955c274fdfd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received unexpected event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 for instance with vm_state active and task_state None.
Oct 08 19:14:33 compute-0 nova_compute[117514]: 2025-10-08 19:14:33.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:35 compute-0 podman[150188]: 2025-10-08 19:14:35.684811102 +0000 UTC m=+0.093136458 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, container_name=openstack_network_exporter)
Oct 08 19:14:35 compute-0 podman[150190]: 2025-10-08 19:14:35.701754041 +0000 UTC m=+0.104986850 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 08 19:14:35 compute-0 podman[150189]: 2025-10-08 19:14:35.708146925 +0000 UTC m=+0.113322990 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:14:35 compute-0 nova_compute[117514]: 2025-10-08 19:14:35.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:35 compute-0 ovn_controller[19759]: 2025-10-08T19:14:35Z|00142|binding|INFO|Releasing lport eabc4672-d176-4f11-b5f6-bcbea840c3e8 from this chassis (sb_readonly=0)
Oct 08 19:14:35 compute-0 NetworkManager[1035]: <info>  [1759950875.7369] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Oct 08 19:14:35 compute-0 NetworkManager[1035]: <info>  [1759950875.7379] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Oct 08 19:14:35 compute-0 ovn_controller[19759]: 2025-10-08T19:14:35Z|00143|binding|INFO|Releasing lport eabc4672-d176-4f11-b5f6-bcbea840c3e8 from this chassis (sb_readonly=0)
Oct 08 19:14:35 compute-0 nova_compute[117514]: 2025-10-08 19:14:35.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:36 compute-0 nova_compute[117514]: 2025-10-08 19:14:35.999 2 DEBUG nova.compute.manager [req-e7b03236-9547-43aa-b203-494b406bfe28 req-fa264db8-af84-44e3-97b4-221c36542ef4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:14:36 compute-0 nova_compute[117514]: 2025-10-08 19:14:36.001 2 DEBUG nova.compute.manager [req-e7b03236-9547-43aa-b203-494b406bfe28 req-fa264db8-af84-44e3-97b4-221c36542ef4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing instance network info cache due to event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:14:36 compute-0 nova_compute[117514]: 2025-10-08 19:14:36.002 2 DEBUG oslo_concurrency.lockutils [req-e7b03236-9547-43aa-b203-494b406bfe28 req-fa264db8-af84-44e3-97b4-221c36542ef4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:14:36 compute-0 nova_compute[117514]: 2025-10-08 19:14:36.003 2 DEBUG oslo_concurrency.lockutils [req-e7b03236-9547-43aa-b203-494b406bfe28 req-fa264db8-af84-44e3-97b4-221c36542ef4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:14:36 compute-0 nova_compute[117514]: 2025-10-08 19:14:36.004 2 DEBUG nova.network.neutron [req-e7b03236-9547-43aa-b203-494b406bfe28 req-fa264db8-af84-44e3-97b4-221c36542ef4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:14:36 compute-0 nova_compute[117514]: 2025-10-08 19:14:36.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:37 compute-0 nova_compute[117514]: 2025-10-08 19:14:37.231 2 DEBUG nova.network.neutron [req-e7b03236-9547-43aa-b203-494b406bfe28 req-fa264db8-af84-44e3-97b4-221c36542ef4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updated VIF entry in instance network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:14:37 compute-0 nova_compute[117514]: 2025-10-08 19:14:37.233 2 DEBUG nova.network.neutron [req-e7b03236-9547-43aa-b203-494b406bfe28 req-fa264db8-af84-44e3-97b4-221c36542ef4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updating instance_info_cache with network_info: [{"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:14:37 compute-0 nova_compute[117514]: 2025-10-08 19:14:37.251 2 DEBUG oslo_concurrency.lockutils [req-e7b03236-9547-43aa-b203-494b406bfe28 req-fa264db8-af84-44e3-97b4-221c36542ef4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:14:38 compute-0 nova_compute[117514]: 2025-10-08 19:14:38.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:39 compute-0 podman[150256]: 2025-10-08 19:14:39.644040398 +0000 UTC m=+0.055340028 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Oct 08 19:14:39 compute-0 podman[150254]: 2025-10-08 19:14:39.655226771 +0000 UTC m=+0.068412425 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 08 19:14:39 compute-0 podman[150255]: 2025-10-08 19:14:39.696561183 +0000 UTC m=+0.109719696 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.487 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.488 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.506 2 DEBUG nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.606 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.608 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.618 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.618 2 INFO nova.compute.claims [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Claim successful on node compute-0.ctlplane.example.com
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.732 2 DEBUG nova.compute.provider_tree [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.747 2 DEBUG nova.scheduler.client.report [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.768 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.769 2 DEBUG nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.817 2 DEBUG nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.818 2 DEBUG nova.network.neutron [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.838 2 INFO nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.864 2 DEBUG nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.956 2 DEBUG nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.958 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.958 2 INFO nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Creating image(s)
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.959 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.960 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.961 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.984 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.077 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.078 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.079 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.112 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.209 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.210 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.243 2 DEBUG nova.policy [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.263 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.264 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.265 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.335 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.336 2 DEBUG nova.virt.disk.api [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.337 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.403 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.404 2 DEBUG nova.virt.disk.api [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.405 2 DEBUG nova.objects.instance [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 2cd8a1e0-1eff-4f72-b839-340a50f3f21c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.421 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.422 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Ensure instance console log exists: /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.422 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.423 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.424 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:42 compute-0 ovn_controller[19759]: 2025-10-08T19:14:42Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:23:50:87 10.100.0.4
Oct 08 19:14:42 compute-0 ovn_controller[19759]: 2025-10-08T19:14:42Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:23:50:87 10.100.0.4
Oct 08 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.810 2 DEBUG nova.network.neutron [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Successfully created port: 2139e839-c698-494f-9fbc-5605baef1d1d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 08 19:14:43 compute-0 nova_compute[117514]: 2025-10-08 19:14:43.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:43 compute-0 nova_compute[117514]: 2025-10-08 19:14:43.879 2 DEBUG nova.network.neutron [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Successfully updated port: 2139e839-c698-494f-9fbc-5605baef1d1d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 08 19:14:43 compute-0 nova_compute[117514]: 2025-10-08 19:14:43.895 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:14:43 compute-0 nova_compute[117514]: 2025-10-08 19:14:43.896 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:14:43 compute-0 nova_compute[117514]: 2025-10-08 19:14:43.896 2 DEBUG nova.network.neutron [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 08 19:14:43 compute-0 nova_compute[117514]: 2025-10-08 19:14:43.976 2 DEBUG nova.compute.manager [req-a5cff1f8-d980-48ea-80eb-6186ce0832e7 req-7f55d453-1e72-4d52-8451-b897f4dff90a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received event network-changed-2139e839-c698-494f-9fbc-5605baef1d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:14:43 compute-0 nova_compute[117514]: 2025-10-08 19:14:43.976 2 DEBUG nova.compute.manager [req-a5cff1f8-d980-48ea-80eb-6186ce0832e7 req-7f55d453-1e72-4d52-8451-b897f4dff90a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Refreshing instance network info cache due to event network-changed-2139e839-c698-494f-9fbc-5605baef1d1d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:14:43 compute-0 nova_compute[117514]: 2025-10-08 19:14:43.977 2 DEBUG oslo_concurrency.lockutils [req-a5cff1f8-d980-48ea-80eb-6186ce0832e7 req-7f55d453-1e72-4d52-8451-b897f4dff90a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.191 2 DEBUG nova.network.neutron [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 08 19:14:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:44.234 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:44.235 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:44.236 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.856 2 DEBUG nova.network.neutron [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Updating instance_info_cache with network_info: [{"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.881 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.881 2 DEBUG nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Instance network_info: |[{"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.882 2 DEBUG oslo_concurrency.lockutils [req-a5cff1f8-d980-48ea-80eb-6186ce0832e7 req-7f55d453-1e72-4d52-8451-b897f4dff90a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.882 2 DEBUG nova.network.neutron [req-a5cff1f8-d980-48ea-80eb-6186ce0832e7 req-7f55d453-1e72-4d52-8451-b897f4dff90a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Refreshing network info cache for port 2139e839-c698-494f-9fbc-5605baef1d1d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.885 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Start _get_guest_xml network_info=[{"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.890 2 WARNING nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.895 2 DEBUG nova.virt.libvirt.host [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.895 2 DEBUG nova.virt.libvirt.host [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.899 2 DEBUG nova.virt.libvirt.host [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.899 2 DEBUG nova.virt.libvirt.host [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.900 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.900 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.901 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.901 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.901 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.901 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.901 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.902 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.902 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.902 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.902 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.903 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.906 2 DEBUG nova.virt.libvirt.vif [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:14:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2104709800',display_name='tempest-TestNetworkBasicOps-server-2104709800',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2104709800',id=12,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD7KRl2SW48tLsDGtdUZXstQI0RJAgkIMeGypW4KhorPNM5dX0aheM9ROODmr544NnSbnVhZPkTpmB3kqR7fi9vzFVS1BaUwNIB2s1Cu3kNzwW4pHA+avxmDokcR+QqgSQ==',key_name='tempest-TestNetworkBasicOps-1494317570',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-vtc0uukp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:14:41Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=2cd8a1e0-1eff-4f72-b839-340a50f3f21c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.906 2 DEBUG nova.network.os_vif_util [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.907 2 DEBUG nova.network.os_vif_util [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:30:5a,bridge_name='br-int',has_traffic_filtering=True,id=2139e839-c698-494f-9fbc-5605baef1d1d,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2139e839-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.908 2 DEBUG nova.objects.instance [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2cd8a1e0-1eff-4f72-b839-340a50f3f21c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.923 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] End _get_guest_xml xml=<domain type="kvm">
Oct 08 19:14:44 compute-0 nova_compute[117514]:   <uuid>2cd8a1e0-1eff-4f72-b839-340a50f3f21c</uuid>
Oct 08 19:14:44 compute-0 nova_compute[117514]:   <name>instance-0000000c</name>
Oct 08 19:14:44 compute-0 nova_compute[117514]:   <memory>131072</memory>
Oct 08 19:14:44 compute-0 nova_compute[117514]:   <vcpu>1</vcpu>
Oct 08 19:14:44 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <nova:name>tempest-TestNetworkBasicOps-server-2104709800</nova:name>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <nova:creationTime>2025-10-08 19:14:44</nova:creationTime>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <nova:flavor name="m1.nano">
Oct 08 19:14:44 compute-0 nova_compute[117514]:         <nova:memory>128</nova:memory>
Oct 08 19:14:44 compute-0 nova_compute[117514]:         <nova:disk>1</nova:disk>
Oct 08 19:14:44 compute-0 nova_compute[117514]:         <nova:swap>0</nova:swap>
Oct 08 19:14:44 compute-0 nova_compute[117514]:         <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:14:44 compute-0 nova_compute[117514]:         <nova:vcpus>1</nova:vcpus>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       </nova:flavor>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <nova:owner>
Oct 08 19:14:44 compute-0 nova_compute[117514]:         <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:14:44 compute-0 nova_compute[117514]:         <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       </nova:owner>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <nova:ports>
Oct 08 19:14:44 compute-0 nova_compute[117514]:         <nova:port uuid="2139e839-c698-494f-9fbc-5605baef1d1d">
Oct 08 19:14:44 compute-0 nova_compute[117514]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:         </nova:port>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       </nova:ports>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     </nova:instance>
Oct 08 19:14:44 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:14:44 compute-0 nova_compute[117514]:   <sysinfo type="smbios">
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <system>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <entry name="manufacturer">RDO</entry>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <entry name="product">OpenStack Compute</entry>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <entry name="serial">2cd8a1e0-1eff-4f72-b839-340a50f3f21c</entry>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <entry name="uuid">2cd8a1e0-1eff-4f72-b839-340a50f3f21c</entry>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <entry name="family">Virtual Machine</entry>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     </system>
Oct 08 19:14:44 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:14:44 compute-0 nova_compute[117514]:   <os>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <boot dev="hd"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <smbios mode="sysinfo"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:   </os>
Oct 08 19:14:44 compute-0 nova_compute[117514]:   <features>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <vmcoreinfo/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:   </features>
Oct 08 19:14:44 compute-0 nova_compute[117514]:   <clock offset="utc">
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <timer name="pit" tickpolicy="delay"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <timer name="hpet" present="no"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:14:44 compute-0 nova_compute[117514]:   <cpu mode="host-model" match="exact">
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <topology sockets="1" cores="1" threads="1"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:14:44 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <disk type="file" device="disk">
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <target dev="vda" bus="virtio"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <disk type="file" device="cdrom">
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <driver name="qemu" type="raw" cache="none"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk.config"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <target dev="sda" bus="sata"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <interface type="ethernet">
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <mac address="fa:16:3e:22:30:5a"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <mtu size="1442"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <target dev="tap2139e839-c6"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <serial type="pty">
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <log file="/var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/console.log" append="off"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <video>
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     </video>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <input type="tablet" bus="usb"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <rng model="virtio">
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <backend model="random">/dev/urandom</backend>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <controller type="usb" index="0"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     <memballoon model="virtio">
Oct 08 19:14:44 compute-0 nova_compute[117514]:       <stats period="10"/>
Oct 08 19:14:44 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:14:44 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:14:44 compute-0 nova_compute[117514]: </domain>
Oct 08 19:14:44 compute-0 nova_compute[117514]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.924 2 DEBUG nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Preparing to wait for external event network-vif-plugged-2139e839-c698-494f-9fbc-5605baef1d1d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.924 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.924 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.924 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.925 2 DEBUG nova.virt.libvirt.vif [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:14:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2104709800',display_name='tempest-TestNetworkBasicOps-server-2104709800',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2104709800',id=12,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD7KRl2SW48tLsDGtdUZXstQI0RJAgkIMeGypW4KhorPNM5dX0aheM9ROODmr544NnSbnVhZPkTpmB3kqR7fi9vzFVS1BaUwNIB2s1Cu3kNzwW4pHA+avxmDokcR+QqgSQ==',key_name='tempest-TestNetworkBasicOps-1494317570',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-vtc0uukp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:14:41Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=2cd8a1e0-1eff-4f72-b839-340a50f3f21c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.925 2 DEBUG nova.network.os_vif_util [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.926 2 DEBUG nova.network.os_vif_util [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:30:5a,bridge_name='br-int',has_traffic_filtering=True,id=2139e839-c698-494f-9fbc-5605baef1d1d,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2139e839-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.926 2 DEBUG os_vif [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:30:5a,bridge_name='br-int',has_traffic_filtering=True,id=2139e839-c698-494f-9fbc-5605baef1d1d,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2139e839-c6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.927 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.927 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.929 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2139e839-c6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.929 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2139e839-c6, col_values=(('external_ids', {'iface-id': '2139e839-c698-494f-9fbc-5605baef1d1d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:30:5a', 'vm-uuid': '2cd8a1e0-1eff-4f72-b839-340a50f3f21c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:44 compute-0 NetworkManager[1035]: <info>  [1759950884.9327] manager: (tap2139e839-c6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.939 2 INFO os_vif [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:30:5a,bridge_name='br-int',has_traffic_filtering=True,id=2139e839-c698-494f-9fbc-5605baef1d1d,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2139e839-c6')
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.014 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.015 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.016 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:22:30:5a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.016 2 INFO nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Using config drive
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.358 2 INFO nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Creating config drive at /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk.config
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.363 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl8utjb96 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.505 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl8utjb96" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:14:45 compute-0 kernel: tap2139e839-c6: entered promiscuous mode
Oct 08 19:14:45 compute-0 NetworkManager[1035]: <info>  [1759950885.5746] manager: (tap2139e839-c6): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Oct 08 19:14:45 compute-0 ovn_controller[19759]: 2025-10-08T19:14:45Z|00144|binding|INFO|Claiming lport 2139e839-c698-494f-9fbc-5605baef1d1d for this chassis.
Oct 08 19:14:45 compute-0 ovn_controller[19759]: 2025-10-08T19:14:45Z|00145|binding|INFO|2139e839-c698-494f-9fbc-5605baef1d1d: Claiming fa:16:3e:22:30:5a 10.100.0.6
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.586 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:30:5a 10.100.0.6'], port_security=['fa:16:3e:22:30:5a 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '2cd8a1e0-1eff-4f72-b839-340a50f3f21c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9bd895d2-82c4-4fc5-81d5-e70c0a9516c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=770536b4-68ae-4751-9b56-96d89b6bc561, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=2139e839-c698-494f-9fbc-5605baef1d1d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.588 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 2139e839-c698-494f-9fbc-5605baef1d1d in datapath 6826b0cb-7eaf-4468-bf17-e3c581bfc4ac bound to our chassis
Oct 08 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.589 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6826b0cb-7eaf-4468-bf17-e3c581bfc4ac
Oct 08 19:14:45 compute-0 ovn_controller[19759]: 2025-10-08T19:14:45Z|00146|binding|INFO|Setting lport 2139e839-c698-494f-9fbc-5605baef1d1d ovn-installed in OVS
Oct 08 19:14:45 compute-0 ovn_controller[19759]: 2025-10-08T19:14:45Z|00147|binding|INFO|Setting lport 2139e839-c698-494f-9fbc-5605baef1d1d up in Southbound
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:45 compute-0 systemd-udevd[150361]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.609 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[70481a22-acd0-46fa-a39f-db00df185299]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:45 compute-0 systemd-machined[77568]: New machine qemu-12-instance-0000000c.
Oct 08 19:14:45 compute-0 NetworkManager[1035]: <info>  [1759950885.6233] device (tap2139e839-c6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 08 19:14:45 compute-0 NetworkManager[1035]: <info>  [1759950885.6243] device (tap2139e839-c6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 08 19:14:45 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000000c.
Oct 08 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.642 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[fd2405d4-577c-40e0-9183-b64d5f8f3ad5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.646 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[376cb553-ccc9-4e12-b8a2-ca82faad77ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.674 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[9e7c5f8f-3e13-4866-a9e8-4b3f7fd179d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.697 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[17a256db-7ef0-42ec-8892-e09adeaefef8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6826b0cb-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:04:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 832, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 832, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 154024, 'reachable_time': 41558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 150372, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.721 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1b73410e-8f5c-4a59-a9b9-69ddff97413f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6826b0cb-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 154039, 'tstamp': 154039}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 150375, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6826b0cb-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 154043, 'tstamp': 154043}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 150375, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.723 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6826b0cb-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.726 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6826b0cb-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.726 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.727 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6826b0cb-70, col_values=(('external_ids', {'iface-id': 'eabc4672-d176-4f11-b5f6-bcbea840c3e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.727 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.764 2 DEBUG nova.compute.manager [req-69e4a6b9-494e-4bac-82e5-a2e121a5386f req-2d30ef81-3f22-4512-b0bd-dae1e7f41322 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received event network-vif-plugged-2139e839-c698-494f-9fbc-5605baef1d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.765 2 DEBUG oslo_concurrency.lockutils [req-69e4a6b9-494e-4bac-82e5-a2e121a5386f req-2d30ef81-3f22-4512-b0bd-dae1e7f41322 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.769 2 DEBUG oslo_concurrency.lockutils [req-69e4a6b9-494e-4bac-82e5-a2e121a5386f req-2d30ef81-3f22-4512-b0bd-dae1e7f41322 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.769 2 DEBUG oslo_concurrency.lockutils [req-69e4a6b9-494e-4bac-82e5-a2e121a5386f req-2d30ef81-3f22-4512-b0bd-dae1e7f41322 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.770 2 DEBUG nova.compute.manager [req-69e4a6b9-494e-4bac-82e5-a2e121a5386f req-2d30ef81-3f22-4512-b0bd-dae1e7f41322 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Processing event network-vif-plugged-2139e839-c698-494f-9fbc-5605baef1d1d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.972 2 DEBUG nova.network.neutron [req-a5cff1f8-d980-48ea-80eb-6186ce0832e7 req-7f55d453-1e72-4d52-8451-b897f4dff90a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Updated VIF entry in instance network info cache for port 2139e839-c698-494f-9fbc-5605baef1d1d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.972 2 DEBUG nova.network.neutron [req-a5cff1f8-d980-48ea-80eb-6186ce0832e7 req-7f55d453-1e72-4d52-8451-b897f4dff90a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Updating instance_info_cache with network_info: [{"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.991 2 DEBUG oslo_concurrency.lockutils [req-a5cff1f8-d980-48ea-80eb-6186ce0832e7 req-7f55d453-1e72-4d52-8451-b897f4dff90a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.555 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950886.554685, 2cd8a1e0-1eff-4f72-b839-340a50f3f21c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.555 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] VM Started (Lifecycle Event)
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.557 2 DEBUG nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.560 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.563 2 INFO nova.virt.libvirt.driver [-] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Instance spawned successfully.
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.564 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.580 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.586 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.590 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.591 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.591 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.592 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.592 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.593 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.618 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.619 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950886.5553448, 2cd8a1e0-1eff-4f72-b839-340a50f3f21c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.619 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] VM Paused (Lifecycle Event)
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.642 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.646 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950886.5599563, 2cd8a1e0-1eff-4f72-b839-340a50f3f21c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.646 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] VM Resumed (Lifecycle Event)
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.650 2 INFO nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Took 4.69 seconds to spawn the instance on the hypervisor.
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.650 2 DEBUG nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.663 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.667 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.695 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.713 2 INFO nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Took 5.15 seconds to build instance.
Oct 08 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.728 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:47 compute-0 nova_compute[117514]: 2025-10-08 19:14:47.844 2 DEBUG nova.compute.manager [req-9aa6218b-debe-40b6-85d1-98bffc582343 req-83c6c786-0aba-4b25-8d15-8815e381e64e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received event network-vif-plugged-2139e839-c698-494f-9fbc-5605baef1d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:14:47 compute-0 nova_compute[117514]: 2025-10-08 19:14:47.845 2 DEBUG oslo_concurrency.lockutils [req-9aa6218b-debe-40b6-85d1-98bffc582343 req-83c6c786-0aba-4b25-8d15-8815e381e64e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:14:47 compute-0 nova_compute[117514]: 2025-10-08 19:14:47.845 2 DEBUG oslo_concurrency.lockutils [req-9aa6218b-debe-40b6-85d1-98bffc582343 req-83c6c786-0aba-4b25-8d15-8815e381e64e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:14:47 compute-0 nova_compute[117514]: 2025-10-08 19:14:47.846 2 DEBUG oslo_concurrency.lockutils [req-9aa6218b-debe-40b6-85d1-98bffc582343 req-83c6c786-0aba-4b25-8d15-8815e381e64e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:14:47 compute-0 nova_compute[117514]: 2025-10-08 19:14:47.846 2 DEBUG nova.compute.manager [req-9aa6218b-debe-40b6-85d1-98bffc582343 req-83c6c786-0aba-4b25-8d15-8815e381e64e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] No waiting events found dispatching network-vif-plugged-2139e839-c698-494f-9fbc-5605baef1d1d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:14:47 compute-0 nova_compute[117514]: 2025-10-08 19:14:47.847 2 WARNING nova.compute.manager [req-9aa6218b-debe-40b6-85d1-98bffc582343 req-83c6c786-0aba-4b25-8d15-8815e381e64e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received unexpected event network-vif-plugged-2139e839-c698-494f-9fbc-5605baef1d1d for instance with vm_state active and task_state None.
Oct 08 19:14:49 compute-0 nova_compute[117514]: 2025-10-08 19:14:49.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:50 compute-0 podman[150384]: 2025-10-08 19:14:50.672219968 +0000 UTC m=+0.082401759 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 08 19:14:51 compute-0 nova_compute[117514]: 2025-10-08 19:14:51.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:54 compute-0 nova_compute[117514]: 2025-10-08 19:14:54.283 2 DEBUG nova.compute.manager [req-127840b5-84fb-40f7-9219-e6c1e1f035b0 req-6ccc32d5-16cf-48fa-96e7-ef506b9537de bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received event network-changed-2139e839-c698-494f-9fbc-5605baef1d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:14:54 compute-0 nova_compute[117514]: 2025-10-08 19:14:54.283 2 DEBUG nova.compute.manager [req-127840b5-84fb-40f7-9219-e6c1e1f035b0 req-6ccc32d5-16cf-48fa-96e7-ef506b9537de bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Refreshing instance network info cache due to event network-changed-2139e839-c698-494f-9fbc-5605baef1d1d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:14:54 compute-0 nova_compute[117514]: 2025-10-08 19:14:54.284 2 DEBUG oslo_concurrency.lockutils [req-127840b5-84fb-40f7-9219-e6c1e1f035b0 req-6ccc32d5-16cf-48fa-96e7-ef506b9537de bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:14:54 compute-0 nova_compute[117514]: 2025-10-08 19:14:54.284 2 DEBUG oslo_concurrency.lockutils [req-127840b5-84fb-40f7-9219-e6c1e1f035b0 req-6ccc32d5-16cf-48fa-96e7-ef506b9537de bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:14:54 compute-0 nova_compute[117514]: 2025-10-08 19:14:54.285 2 DEBUG nova.network.neutron [req-127840b5-84fb-40f7-9219-e6c1e1f035b0 req-6ccc32d5-16cf-48fa-96e7-ef506b9537de bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Refreshing network info cache for port 2139e839-c698-494f-9fbc-5605baef1d1d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:14:54 compute-0 nova_compute[117514]: 2025-10-08 19:14:54.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:56 compute-0 nova_compute[117514]: 2025-10-08 19:14:56.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:14:56 compute-0 nova_compute[117514]: 2025-10-08 19:14:56.525 2 DEBUG nova.network.neutron [req-127840b5-84fb-40f7-9219-e6c1e1f035b0 req-6ccc32d5-16cf-48fa-96e7-ef506b9537de bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Updated VIF entry in instance network info cache for port 2139e839-c698-494f-9fbc-5605baef1d1d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:14:56 compute-0 nova_compute[117514]: 2025-10-08 19:14:56.526 2 DEBUG nova.network.neutron [req-127840b5-84fb-40f7-9219-e6c1e1f035b0 req-6ccc32d5-16cf-48fa-96e7-ef506b9537de bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Updating instance_info_cache with network_info: [{"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:14:56 compute-0 nova_compute[117514]: 2025-10-08 19:14:56.549 2 DEBUG oslo_concurrency.lockutils [req-127840b5-84fb-40f7-9219-e6c1e1f035b0 req-6ccc32d5-16cf-48fa-96e7-ef506b9537de bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:14:58 compute-0 ovn_controller[19759]: 2025-10-08T19:14:58Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:22:30:5a 10.100.0.6
Oct 08 19:14:58 compute-0 ovn_controller[19759]: 2025-10-08T19:14:58Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:30:5a 10.100.0.6
Oct 08 19:14:59 compute-0 nova_compute[117514]: 2025-10-08 19:14:59.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:00 compute-0 podman[150424]: 2025-10-08 19:15:00.646970216 +0000 UTC m=+0.064794601 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 08 19:15:01 compute-0 nova_compute[117514]: 2025-10-08 19:15:01.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:04 compute-0 nova_compute[117514]: 2025-10-08 19:15:04.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:06 compute-0 nova_compute[117514]: 2025-10-08 19:15:06.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:06 compute-0 podman[150446]: 2025-10-08 19:15:06.660704661 +0000 UTC m=+0.068640230 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 08 19:15:06 compute-0 podman[150445]: 2025-10-08 19:15:06.661845744 +0000 UTC m=+0.081528221 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 08 19:15:06 compute-0 podman[150444]: 2025-10-08 19:15:06.670197675 +0000 UTC m=+0.089265405 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 08 19:15:07 compute-0 nova_compute[117514]: 2025-10-08 19:15:07.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.752 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.753 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.753 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.753 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.837 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:15:09 compute-0 podman[150512]: 2025-10-08 19:15:09.892189396 +0000 UTC m=+0.076619891 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 08 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.914 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.915 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:15:09 compute-0 podman[150516]: 2025-10-08 19:15:09.920642496 +0000 UTC m=+0.083319943 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 08 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:09 compute-0 podman[150513]: 2025-10-08 19:15:09.947024197 +0000 UTC m=+0.124953564 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.968 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.974 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.038 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.039 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.108 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.343 2 INFO nova.compute.manager [None req-e9073bc5-124f-4ca9-b785-d7b35abced05 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Get console output
Oct 08 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.349 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct 08 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.359 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.360 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5786MB free_disk=73.35597610473633GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.360 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.360 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.446 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Instance 5e004931-f1db-408c-9f7a-6c6c50c5f8ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 08 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.447 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Instance 2cd8a1e0-1eff-4f72-b839-340a50f3f21c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 08 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.447 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.448 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.515 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.529 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.559 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.559 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:11 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:11.431 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:11 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:11.432 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 08 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.555 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.556 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.556 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.585 2 DEBUG nova.compute.manager [req-54561677-c8aa-44ec-ae59-06183821da70 req-b0f08bc3-b63e-49e3-aa39-babd3007c281 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.585 2 DEBUG nova.compute.manager [req-54561677-c8aa-44ec-ae59-06183821da70 req-b0f08bc3-b63e-49e3-aa39-babd3007c281 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing instance network info cache due to event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.586 2 DEBUG oslo_concurrency.lockutils [req-54561677-c8aa-44ec-ae59-06183821da70 req-b0f08bc3-b63e-49e3-aa39-babd3007c281 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.586 2 DEBUG oslo_concurrency.lockutils [req-54561677-c8aa-44ec-ae59-06183821da70 req-b0f08bc3-b63e-49e3-aa39-babd3007c281 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.587 2 DEBUG nova.network.neutron [req-54561677-c8aa-44ec-ae59-06183821da70 req-b0f08bc3-b63e-49e3-aa39-babd3007c281 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.718 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.718 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:15:12 compute-0 nova_compute[117514]: 2025-10-08 19:15:12.199 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:15:12 compute-0 nova_compute[117514]: 2025-10-08 19:15:12.602 2 INFO nova.compute.manager [None req-47bb72c8-0d18-4fcf-aba2-3c694aa24a37 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Get console output
Oct 08 19:15:12 compute-0 nova_compute[117514]: 2025-10-08 19:15:12.609 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct 08 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.374 2 DEBUG nova.network.neutron [req-54561677-c8aa-44ec-ae59-06183821da70 req-b0f08bc3-b63e-49e3-aa39-babd3007c281 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updated VIF entry in instance network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.375 2 DEBUG nova.network.neutron [req-54561677-c8aa-44ec-ae59-06183821da70 req-b0f08bc3-b63e-49e3-aa39-babd3007c281 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updating instance_info_cache with network_info: [{"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.393 2 DEBUG oslo_concurrency.lockutils [req-54561677-c8aa-44ec-ae59-06183821da70 req-b0f08bc3-b63e-49e3-aa39-babd3007c281 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.395 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquired lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.395 2 DEBUG nova.network.neutron [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 08 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.396 2 DEBUG nova.objects.instance [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5e004931-f1db-408c-9f7a-6c6c50c5f8ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.661 2 DEBUG nova.compute.manager [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-unplugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.662 2 DEBUG oslo_concurrency.lockutils [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.663 2 DEBUG oslo_concurrency.lockutils [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.663 2 DEBUG oslo_concurrency.lockutils [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.664 2 DEBUG nova.compute.manager [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] No waiting events found dispatching network-vif-unplugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.664 2 WARNING nova.compute.manager [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received unexpected event network-vif-unplugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 for instance with vm_state active and task_state None.
Oct 08 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.665 2 DEBUG nova.compute.manager [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.665 2 DEBUG oslo_concurrency.lockutils [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.666 2 DEBUG oslo_concurrency.lockutils [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.666 2 DEBUG oslo_concurrency.lockutils [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.667 2 DEBUG nova.compute.manager [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] No waiting events found dispatching network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.667 2 WARNING nova.compute.manager [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received unexpected event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 for instance with vm_state active and task_state None.
Oct 08 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.708 2 DEBUG nova.compute.manager [req-4d4b5d3f-bfd5-4876-8a22-9c955f3dfac5 req-e0b22032-c728-46d3-9aed-b54a8f80ae7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.709 2 DEBUG nova.compute.manager [req-4d4b5d3f-bfd5-4876-8a22-9c955f3dfac5 req-e0b22032-c728-46d3-9aed-b54a8f80ae7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing instance network info cache due to event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.709 2 DEBUG oslo_concurrency.lockutils [req-4d4b5d3f-bfd5-4876-8a22-9c955f3dfac5 req-e0b22032-c728-46d3-9aed-b54a8f80ae7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.813 2 INFO nova.compute.manager [None req-948129db-58f7-42b5-be85-39f515f6e1f5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Get console output
Oct 08 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.819 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct 08 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.821 2 DEBUG nova.network.neutron [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updating instance_info_cache with network_info: [{"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.836 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Releasing lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.837 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 08 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.837 2 DEBUG oslo_concurrency.lockutils [req-4d4b5d3f-bfd5-4876-8a22-9c955f3dfac5 req-e0b22032-c728-46d3-9aed-b54a8f80ae7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.837 2 DEBUG nova.network.neutron [req-4d4b5d3f-bfd5-4876-8a22-9c955f3dfac5 req-e0b22032-c728-46d3-9aed-b54a8f80ae7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.840 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.841 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.683 2 DEBUG oslo_concurrency.lockutils [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.683 2 DEBUG oslo_concurrency.lockutils [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.683 2 DEBUG oslo_concurrency.lockutils [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.684 2 DEBUG oslo_concurrency.lockutils [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.684 2 DEBUG oslo_concurrency.lockutils [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.685 2 INFO nova.compute.manager [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Terminating instance
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.686 2 DEBUG nova.compute.manager [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 08 19:15:15 compute-0 kernel: tap2139e839-c6 (unregistering): left promiscuous mode
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:15:15 compute-0 NetworkManager[1035]: <info>  [1759950915.7200] device (tap2139e839-c6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:15 compute-0 ovn_controller[19759]: 2025-10-08T19:15:15Z|00148|binding|INFO|Releasing lport 2139e839-c698-494f-9fbc-5605baef1d1d from this chassis (sb_readonly=0)
Oct 08 19:15:15 compute-0 ovn_controller[19759]: 2025-10-08T19:15:15Z|00149|binding|INFO|Setting lport 2139e839-c698-494f-9fbc-5605baef1d1d down in Southbound
Oct 08 19:15:15 compute-0 ovn_controller[19759]: 2025-10-08T19:15:15Z|00150|binding|INFO|Removing iface tap2139e839-c6 ovn-installed in OVS
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.739 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:30:5a 10.100.0.6'], port_security=['fa:16:3e:22:30:5a 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '2cd8a1e0-1eff-4f72-b839-340a50f3f21c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9bd895d2-82c4-4fc5-81d5-e70c0a9516c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=770536b4-68ae-4751-9b56-96d89b6bc561, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=2139e839-c698-494f-9fbc-5605baef1d1d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.741 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 2139e839-c698-494f-9fbc-5605baef1d1d in datapath 6826b0cb-7eaf-4468-bf17-e3c581bfc4ac unbound from our chassis
Oct 08 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.743 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6826b0cb-7eaf-4468-bf17-e3c581bfc4ac
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.758 2 DEBUG nova.compute.manager [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.758 2 DEBUG oslo_concurrency.lockutils [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.758 2 DEBUG oslo_concurrency.lockutils [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.758 2 DEBUG oslo_concurrency.lockutils [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.759 2 DEBUG nova.compute.manager [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] No waiting events found dispatching network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.759 2 WARNING nova.compute.manager [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received unexpected event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 for instance with vm_state active and task_state None.
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.759 2 DEBUG nova.compute.manager [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.759 2 DEBUG oslo_concurrency.lockutils [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.759 2 DEBUG oslo_concurrency.lockutils [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.759 2 DEBUG oslo_concurrency.lockutils [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.759 2 DEBUG nova.compute.manager [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] No waiting events found dispatching network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.760 2 WARNING nova.compute.manager [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received unexpected event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 for instance with vm_state active and task_state None.
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.775 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[efbf1580-5f18-4106-9d0c-b6bab3013b54]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:15 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Oct 08 19:15:15 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Consumed 13.444s CPU time.
Oct 08 19:15:15 compute-0 systemd-machined[77568]: Machine qemu-12-instance-0000000c terminated.
Oct 08 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.815 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[69b0faff-efa0-456c-9fe0-e9aca62e53d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.821 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[cd5f187f-017c-464b-8a4a-afbab476ba4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.868 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[bd0d32c2-acb6-41a5-9998-29fac91f4350]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.900 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[84ff2abd-02cc-4f3f-a799-cc8aa38ebdff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6826b0cb-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:04:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 8, 'rx_bytes': 1000, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 8, 'rx_bytes': 1000, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 154024, 'reachable_time': 41558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 150595, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.930 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[62d58742-c32a-4f2b-b821-32e8d78cde8d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6826b0cb-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 154039, 'tstamp': 154039}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 150598, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6826b0cb-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 154043, 'tstamp': 154043}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 150598, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.932 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6826b0cb-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.946 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6826b0cb-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.947 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.947 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6826b0cb-70, col_values=(('external_ids', {'iface-id': 'eabc4672-d176-4f11-b5f6-bcbea840c3e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.948 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.968 2 INFO nova.virt.libvirt.driver [-] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Instance destroyed successfully.
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.969 2 DEBUG nova.objects.instance [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 2cd8a1e0-1eff-4f72-b839-340a50f3f21c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.983 2 DEBUG nova.virt.libvirt.vif [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:14:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2104709800',display_name='tempest-TestNetworkBasicOps-server-2104709800',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2104709800',id=12,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD7KRl2SW48tLsDGtdUZXstQI0RJAgkIMeGypW4KhorPNM5dX0aheM9ROODmr544NnSbnVhZPkTpmB3kqR7fi9vzFVS1BaUwNIB2s1Cu3kNzwW4pHA+avxmDokcR+QqgSQ==',key_name='tempest-TestNetworkBasicOps-1494317570',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:14:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-vtc0uukp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:14:46Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=2cd8a1e0-1eff-4f72-b839-340a50f3f21c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.983 2 DEBUG nova.network.os_vif_util [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.985 2 DEBUG nova.network.os_vif_util [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:30:5a,bridge_name='br-int',has_traffic_filtering=True,id=2139e839-c698-494f-9fbc-5605baef1d1d,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2139e839-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.985 2 DEBUG os_vif [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:30:5a,bridge_name='br-int',has_traffic_filtering=True,id=2139e839-c698-494f-9fbc-5605baef1d1d,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2139e839-c6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.988 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2139e839-c6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.996 2 INFO os_vif [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:30:5a,bridge_name='br-int',has_traffic_filtering=True,id=2139e839-c698-494f-9fbc-5605baef1d1d,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2139e839-c6')
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.997 2 INFO nova.virt.libvirt.driver [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Deleting instance files /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c_del
Oct 08 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.998 2 INFO nova.virt.libvirt.driver [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Deletion of /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c_del complete
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.055 2 INFO nova.compute.manager [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Took 0.37 seconds to destroy the instance on the hypervisor.
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.056 2 DEBUG oslo.service.loopingcall [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.056 2 DEBUG nova.compute.manager [-] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.056 2 DEBUG nova.network.neutron [-] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.222 2 DEBUG nova.network.neutron [req-4d4b5d3f-bfd5-4876-8a22-9c955f3dfac5 req-e0b22032-c728-46d3-9aed-b54a8f80ae7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updated VIF entry in instance network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.223 2 DEBUG nova.network.neutron [req-4d4b5d3f-bfd5-4876-8a22-9c955f3dfac5 req-e0b22032-c728-46d3-9aed-b54a8f80ae7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updating instance_info_cache with network_info: [{"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.238 2 DEBUG oslo_concurrency.lockutils [req-4d4b5d3f-bfd5-4876-8a22-9c955f3dfac5 req-e0b22032-c728-46d3-9aed-b54a8f80ae7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.693 2 DEBUG nova.network.neutron [-] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.711 2 INFO nova.compute.manager [-] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Took 0.65 seconds to deallocate network for instance.
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.754 2 DEBUG oslo_concurrency.lockutils [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.755 2 DEBUG oslo_concurrency.lockutils [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.779 2 DEBUG nova.compute.manager [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received event network-changed-2139e839-c698-494f-9fbc-5605baef1d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.779 2 DEBUG nova.compute.manager [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Refreshing instance network info cache due to event network-changed-2139e839-c698-494f-9fbc-5605baef1d1d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.780 2 DEBUG oslo_concurrency.lockutils [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.780 2 DEBUG oslo_concurrency.lockutils [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.781 2 DEBUG nova.network.neutron [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Refreshing network info cache for port 2139e839-c698-494f-9fbc-5605baef1d1d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.843 2 DEBUG nova.compute.provider_tree [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.864 2 DEBUG nova.scheduler.client.report [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.885 2 DEBUG oslo_concurrency.lockutils [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.909 2 INFO nova.scheduler.client.report [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 2cd8a1e0-1eff-4f72-b839-340a50f3f21c
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.912 2 DEBUG nova.network.neutron [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 08 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.975 2 DEBUG oslo_concurrency.lockutils [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.199 2 DEBUG nova.network.neutron [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.216 2 DEBUG oslo_concurrency.lockutils [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.216 2 DEBUG nova.compute.manager [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received event network-vif-unplugged-2139e839-c698-494f-9fbc-5605baef1d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.217 2 DEBUG oslo_concurrency.lockutils [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.218 2 DEBUG oslo_concurrency.lockutils [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.218 2 DEBUG oslo_concurrency.lockutils [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.219 2 DEBUG nova.compute.manager [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] No waiting events found dispatching network-vif-unplugged-2139e839-c698-494f-9fbc-5605baef1d1d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.220 2 WARNING nova.compute.manager [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received unexpected event network-vif-unplugged-2139e839-c698-494f-9fbc-5605baef1d1d for instance with vm_state deleted and task_state None.
Oct 08 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.220 2 DEBUG nova.compute.manager [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received event network-vif-plugged-2139e839-c698-494f-9fbc-5605baef1d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.221 2 DEBUG oslo_concurrency.lockutils [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.221 2 DEBUG oslo_concurrency.lockutils [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.222 2 DEBUG oslo_concurrency.lockutils [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.223 2 DEBUG nova.compute.manager [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] No waiting events found dispatching network-vif-plugged-2139e839-c698-494f-9fbc-5605baef1d1d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.223 2 WARNING nova.compute.manager [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received unexpected event network-vif-plugged-2139e839-c698-494f-9fbc-5605baef1d1d for instance with vm_state deleted and task_state None.
Oct 08 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.224 2 DEBUG nova.compute.manager [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received event network-vif-deleted-2139e839-c698-494f-9fbc-5605baef1d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:15:17 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:17.434 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.715 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.761 2 DEBUG oslo_concurrency.lockutils [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.762 2 DEBUG oslo_concurrency.lockutils [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.763 2 DEBUG oslo_concurrency.lockutils [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.763 2 DEBUG oslo_concurrency.lockutils [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.764 2 DEBUG oslo_concurrency.lockutils [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.766 2 INFO nova.compute.manager [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Terminating instance
Oct 08 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.768 2 DEBUG nova.compute.manager [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 08 19:15:18 compute-0 kernel: tapae9e7968-10 (unregistering): left promiscuous mode
Oct 08 19:15:18 compute-0 NetworkManager[1035]: <info>  [1759950918.7959] device (tapae9e7968-10): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 19:15:18 compute-0 ovn_controller[19759]: 2025-10-08T19:15:18Z|00151|binding|INFO|Releasing lport ae9e7968-10b0-4606-9fa3-c91374cf1cc1 from this chassis (sb_readonly=0)
Oct 08 19:15:18 compute-0 ovn_controller[19759]: 2025-10-08T19:15:18Z|00152|binding|INFO|Setting lport ae9e7968-10b0-4606-9fa3-c91374cf1cc1 down in Southbound
Oct 08 19:15:18 compute-0 ovn_controller[19759]: 2025-10-08T19:15:18Z|00153|binding|INFO|Removing iface tapae9e7968-10 ovn-installed in OVS
Oct 08 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:18 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:18.819 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:50:87 10.100.0.4'], port_security=['fa:16:3e:23:50:87 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '5e004931-f1db-408c-9f7a-6c6c50c5f8ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c3b607ea-9253-4328-bb00-668338c7a25d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=770536b4-68ae-4751-9b56-96d89b6bc561, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=ae9e7968-10b0-4606-9fa3-c91374cf1cc1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:15:18 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:18.822 28643 INFO neutron.agent.ovn.metadata.agent [-] Port ae9e7968-10b0-4606-9fa3-c91374cf1cc1 in datapath 6826b0cb-7eaf-4468-bf17-e3c581bfc4ac unbound from our chassis
Oct 08 19:15:18 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:18.825 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6826b0cb-7eaf-4468-bf17-e3c581bfc4ac, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 08 19:15:18 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:18.830 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[a1351724-bc88-4969-a0d8-e347bb32ae15]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:18 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:18.831 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac namespace which is not needed anymore
Oct 08 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.879 2 DEBUG nova.compute.manager [req-00b34679-df5a-49ec-91f6-3b0f8940b1b5 req-1ed3953d-d099-40df-bfe7-1c8818fcaccb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.881 2 DEBUG nova.compute.manager [req-00b34679-df5a-49ec-91f6-3b0f8940b1b5 req-1ed3953d-d099-40df-bfe7-1c8818fcaccb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing instance network info cache due to event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.881 2 DEBUG oslo_concurrency.lockutils [req-00b34679-df5a-49ec-91f6-3b0f8940b1b5 req-1ed3953d-d099-40df-bfe7-1c8818fcaccb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.881 2 DEBUG oslo_concurrency.lockutils [req-00b34679-df5a-49ec-91f6-3b0f8940b1b5 req-1ed3953d-d099-40df-bfe7-1c8818fcaccb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.881 2 DEBUG nova.network.neutron [req-00b34679-df5a-49ec-91f6-3b0f8940b1b5 req-1ed3953d-d099-40df-bfe7-1c8818fcaccb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:15:18 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Oct 08 19:15:18 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Consumed 14.004s CPU time.
Oct 08 19:15:18 compute-0 systemd-machined[77568]: Machine qemu-11-instance-0000000b terminated.
Oct 08 19:15:18 compute-0 neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac[150173]: [NOTICE]   (150177) : haproxy version is 2.8.14-c23fe91
Oct 08 19:15:18 compute-0 neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac[150173]: [NOTICE]   (150177) : path to executable is /usr/sbin/haproxy
Oct 08 19:15:18 compute-0 neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac[150173]: [WARNING]  (150177) : Exiting Master process...
Oct 08 19:15:18 compute-0 neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac[150173]: [ALERT]    (150177) : Current worker (150179) exited with code 143 (Terminated)
Oct 08 19:15:18 compute-0 neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac[150173]: [WARNING]  (150177) : All workers exited. Exiting... (0)
Oct 08 19:15:18 compute-0 systemd[1]: libpod-fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b.scope: Deactivated successfully.
Oct 08 19:15:18 compute-0 podman[150637]: 2025-10-08 19:15:18.979531394 +0000 UTC m=+0.046479821 container died fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-d6837e9d7ec807fa0b782d57371b150b504d3cac3a36af379f52c92234713c15-merged.mount: Deactivated successfully.
Oct 08 19:15:19 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b-userdata-shm.mount: Deactivated successfully.
Oct 08 19:15:19 compute-0 podman[150637]: 2025-10-08 19:15:19.038778192 +0000 UTC m=+0.105726589 container cleanup fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.038 2 INFO nova.virt.libvirt.driver [-] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Instance destroyed successfully.
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.039 2 DEBUG nova.objects.instance [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 5e004931-f1db-408c-9f7a-6c6c50c5f8ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.052 2 DEBUG nova.virt.libvirt.vif [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:14:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-103133275',display_name='tempest-TestNetworkBasicOps-server-103133275',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-103133275',id=11,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI5y6d80fHySET4pCbLeqyj0cyDTZn6hTOGziG7pCiD92qFDw7Uq+y0suIKpGvDK2QOm6VBv2vJI5Io6WjjxteICCSlzmOgxu+CdOrYx2YA1B+bI4ndO5c+cp00qcb4ncw==',key_name='tempest-TestNetworkBasicOps-286586540',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:14:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-cyi34c6v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:14:30Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=5e004931-f1db-408c-9f7a-6c6c50c5f8ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 08 19:15:19 compute-0 systemd[1]: libpod-conmon-fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b.scope: Deactivated successfully.
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.053 2 DEBUG nova.network.os_vif_util [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.054 2 DEBUG nova.network.os_vif_util [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:23:50:87,bridge_name='br-int',has_traffic_filtering=True,id=ae9e7968-10b0-4606-9fa3-c91374cf1cc1,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae9e7968-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.054 2 DEBUG os_vif [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:23:50:87,bridge_name='br-int',has_traffic_filtering=True,id=ae9e7968-10b0-4606-9fa3-c91374cf1cc1,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae9e7968-10') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.057 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae9e7968-10, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.064 2 INFO os_vif [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:23:50:87,bridge_name='br-int',has_traffic_filtering=True,id=ae9e7968-10b0-4606-9fa3-c91374cf1cc1,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae9e7968-10')
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.065 2 INFO nova.virt.libvirt.driver [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Deleting instance files /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef_del
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.066 2 INFO nova.virt.libvirt.driver [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Deletion of /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef_del complete
Oct 08 19:15:19 compute-0 podman[150684]: 2025-10-08 19:15:19.113007442 +0000 UTC m=+0.048313804 container remove fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.113 2 INFO nova.compute.manager [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Took 0.34 seconds to destroy the instance on the hypervisor.
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.113 2 DEBUG oslo.service.loopingcall [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.114 2 DEBUG nova.compute.manager [-] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.114 2 DEBUG nova.network.neutron [-] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 08 19:15:19 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:19.120 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[4c9f01e3-0cfe-42ab-b1c3-ded22fc45547]: (4, ('Wed Oct  8 07:15:18 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac (fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b)\nfd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b\nWed Oct  8 07:15:19 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac (fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b)\nfd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:19 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:19.122 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[aac412aa-0e03-47af-a077-d0a1cb6f7e52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:19 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:19.123 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6826b0cb-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:19 compute-0 kernel: tap6826b0cb-70: left promiscuous mode
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:19 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:19.154 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[385b4dd4-0246-4474-b917-863e4a20df72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:19 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:19.187 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[77f3bc3d-3bc1-49ef-83f7-6e7649b84a24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:19 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:19.189 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[92fa3b3d-ed4d-4901-8643-21aa951f7078]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:19 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:19.209 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[30261ff5-3392-4bf2-8dec-3a579754bdf3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 154017, 'reachable_time': 32844, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 150700, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:19 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:19.213 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 08 19:15:19 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:19.213 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[d11a7bf6-9037-4b7b-a17d-9b3fd5598bbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:19 compute-0 systemd[1]: run-netns-ovnmeta\x2d6826b0cb\x2d7eaf\x2d4468\x2dbf17\x2de3c581bfc4ac.mount: Deactivated successfully.
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.262 2 DEBUG nova.compute.manager [req-4db90871-fc73-4108-8839-ed9842217211 req-168b42c7-1d0d-42e0-af0f-0a9ac648938b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-unplugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.263 2 DEBUG oslo_concurrency.lockutils [req-4db90871-fc73-4108-8839-ed9842217211 req-168b42c7-1d0d-42e0-af0f-0a9ac648938b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.263 2 DEBUG oslo_concurrency.lockutils [req-4db90871-fc73-4108-8839-ed9842217211 req-168b42c7-1d0d-42e0-af0f-0a9ac648938b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.263 2 DEBUG oslo_concurrency.lockutils [req-4db90871-fc73-4108-8839-ed9842217211 req-168b42c7-1d0d-42e0-af0f-0a9ac648938b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.263 2 DEBUG nova.compute.manager [req-4db90871-fc73-4108-8839-ed9842217211 req-168b42c7-1d0d-42e0-af0f-0a9ac648938b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] No waiting events found dispatching network-vif-unplugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.264 2 DEBUG nova.compute.manager [req-4db90871-fc73-4108-8839-ed9842217211 req-168b42c7-1d0d-42e0-af0f-0a9ac648938b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-unplugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.801 2 DEBUG nova.network.neutron [-] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.816 2 INFO nova.compute.manager [-] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Took 0.70 seconds to deallocate network for instance.
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.865 2 DEBUG oslo_concurrency.lockutils [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.865 2 DEBUG oslo_concurrency.lockutils [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.913 2 DEBUG nova.compute.provider_tree [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.928 2 DEBUG nova.scheduler.client.report [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.951 2 DEBUG oslo_concurrency.lockutils [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.973 2 INFO nova.scheduler.client.report [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 5e004931-f1db-408c-9f7a-6c6c50c5f8ef
Oct 08 19:15:20 compute-0 nova_compute[117514]: 2025-10-08 19:15:20.037 2 DEBUG oslo_concurrency.lockutils [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.275s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:20 compute-0 nova_compute[117514]: 2025-10-08 19:15:20.233 2 DEBUG nova.network.neutron [req-00b34679-df5a-49ec-91f6-3b0f8940b1b5 req-1ed3953d-d099-40df-bfe7-1c8818fcaccb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updated VIF entry in instance network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:15:20 compute-0 nova_compute[117514]: 2025-10-08 19:15:20.233 2 DEBUG nova.network.neutron [req-00b34679-df5a-49ec-91f6-3b0f8940b1b5 req-1ed3953d-d099-40df-bfe7-1c8818fcaccb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updating instance_info_cache with network_info: [{"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:15:20 compute-0 nova_compute[117514]: 2025-10-08 19:15:20.254 2 DEBUG oslo_concurrency.lockutils [req-00b34679-df5a-49ec-91f6-3b0f8940b1b5 req-1ed3953d-d099-40df-bfe7-1c8818fcaccb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:15:21 compute-0 nova_compute[117514]: 2025-10-08 19:15:21.354 2 DEBUG nova.compute.manager [req-48566298-102e-41f1-84cb-ba28998cbea0 req-8c223513-6006-41fc-b52a-f18526bd888c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:15:21 compute-0 nova_compute[117514]: 2025-10-08 19:15:21.355 2 DEBUG oslo_concurrency.lockutils [req-48566298-102e-41f1-84cb-ba28998cbea0 req-8c223513-6006-41fc-b52a-f18526bd888c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:21 compute-0 nova_compute[117514]: 2025-10-08 19:15:21.355 2 DEBUG oslo_concurrency.lockutils [req-48566298-102e-41f1-84cb-ba28998cbea0 req-8c223513-6006-41fc-b52a-f18526bd888c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:21 compute-0 nova_compute[117514]: 2025-10-08 19:15:21.356 2 DEBUG oslo_concurrency.lockutils [req-48566298-102e-41f1-84cb-ba28998cbea0 req-8c223513-6006-41fc-b52a-f18526bd888c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:21 compute-0 nova_compute[117514]: 2025-10-08 19:15:21.356 2 DEBUG nova.compute.manager [req-48566298-102e-41f1-84cb-ba28998cbea0 req-8c223513-6006-41fc-b52a-f18526bd888c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] No waiting events found dispatching network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:15:21 compute-0 nova_compute[117514]: 2025-10-08 19:15:21.357 2 WARNING nova.compute.manager [req-48566298-102e-41f1-84cb-ba28998cbea0 req-8c223513-6006-41fc-b52a-f18526bd888c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received unexpected event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 for instance with vm_state deleted and task_state None.
Oct 08 19:15:21 compute-0 nova_compute[117514]: 2025-10-08 19:15:21.357 2 DEBUG nova.compute.manager [req-48566298-102e-41f1-84cb-ba28998cbea0 req-8c223513-6006-41fc-b52a-f18526bd888c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-deleted-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:15:21 compute-0 nova_compute[117514]: 2025-10-08 19:15:21.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:21 compute-0 podman[150701]: 2025-10-08 19:15:21.689543543 +0000 UTC m=+0.095794074 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 08 19:15:22 compute-0 nova_compute[117514]: 2025-10-08 19:15:22.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:22 compute-0 nova_compute[117514]: 2025-10-08 19:15:22.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:24 compute-0 nova_compute[117514]: 2025-10-08 19:15:24.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:26 compute-0 nova_compute[117514]: 2025-10-08 19:15:26.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:29 compute-0 nova_compute[117514]: 2025-10-08 19:15:29.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:30 compute-0 nova_compute[117514]: 2025-10-08 19:15:30.967 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950915.9657428, 2cd8a1e0-1eff-4f72-b839-340a50f3f21c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:15:30 compute-0 nova_compute[117514]: 2025-10-08 19:15:30.967 2 INFO nova.compute.manager [-] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] VM Stopped (Lifecycle Event)
Oct 08 19:15:30 compute-0 nova_compute[117514]: 2025-10-08 19:15:30.993 2 DEBUG nova.compute.manager [None req-732071df-72e8-413e-b3f7-43bc5c6d67cd - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:15:31 compute-0 nova_compute[117514]: 2025-10-08 19:15:31.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:31 compute-0 podman[150726]: 2025-10-08 19:15:31.713788796 +0000 UTC m=+0.120823875 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct 08 19:15:34 compute-0 nova_compute[117514]: 2025-10-08 19:15:34.035 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950919.0330064, 5e004931-f1db-408c-9f7a-6c6c50c5f8ef => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:15:34 compute-0 nova_compute[117514]: 2025-10-08 19:15:34.036 2 INFO nova.compute.manager [-] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] VM Stopped (Lifecycle Event)
Oct 08 19:15:34 compute-0 nova_compute[117514]: 2025-10-08 19:15:34.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:34 compute-0 nova_compute[117514]: 2025-10-08 19:15:34.065 2 DEBUG nova.compute.manager [None req-31b0b051-8b49-4c51-b73f-02e5a245b715 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:15:36 compute-0 nova_compute[117514]: 2025-10-08 19:15:36.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:37 compute-0 podman[150747]: 2025-10-08 19:15:37.684224384 +0000 UTC m=+0.084069545 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, managed_by=edpm_ansible)
Oct 08 19:15:37 compute-0 podman[150746]: 2025-10-08 19:15:37.684243324 +0000 UTC m=+0.098478900 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7)
Oct 08 19:15:37 compute-0 podman[150748]: 2025-10-08 19:15:37.698466754 +0000 UTC m=+0.101371314 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 08 19:15:39 compute-0 nova_compute[117514]: 2025-10-08 19:15:39.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:40 compute-0 podman[150809]: 2025-10-08 19:15:40.648848894 +0000 UTC m=+0.053626898 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 08 19:15:40 compute-0 podman[150807]: 2025-10-08 19:15:40.693590274 +0000 UTC m=+0.098239144 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 08 19:15:40 compute-0 podman[150808]: 2025-10-08 19:15:40.767960179 +0000 UTC m=+0.165142094 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 08 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.538 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "34ca788d-2398-4a40-9f96-040c0849b18f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.539 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.558 2 DEBUG nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 08 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.639 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.640 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.651 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 08 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.651 2 INFO nova.compute.claims [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Claim successful on node compute-0.ctlplane.example.com
Oct 08 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.788 2 DEBUG nova.compute.provider_tree [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.808 2 DEBUG nova.scheduler.client.report [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.832 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.833 2 DEBUG nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 08 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.887 2 DEBUG nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 08 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.888 2 DEBUG nova.network.neutron [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 08 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.907 2 INFO nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 08 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.928 2 DEBUG nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.033 2 DEBUG nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.035 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.036 2 INFO nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Creating image(s)
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.037 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.038 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.039 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.061 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.149 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.150 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.151 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.173 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.230 2 DEBUG nova.policy [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.253 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.254 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.296 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.297 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.298 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.379 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.381 2 DEBUG nova.virt.disk.api [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.382 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.476 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.478 2 DEBUG nova.virt.disk.api [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.479 2 DEBUG nova.objects.instance [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 34ca788d-2398-4a40-9f96-040c0849b18f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.493 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.494 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Ensure instance console log exists: /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.494 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.495 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.495 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:43 compute-0 nova_compute[117514]: 2025-10-08 19:15:43.324 2 DEBUG nova.network.neutron [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Successfully created port: 06998e1e-8ce7-484d-b3e4-7d44699229c4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.003 2 DEBUG nova.network.neutron [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Successfully updated port: 06998e1e-8ce7-484d-b3e4-7d44699229c4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.028 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.029 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.029 2 DEBUG nova.network.neutron [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.100 2 DEBUG nova.compute.manager [req-7046266a-73fc-4410-b39a-7bc6485e7f66 req-1e78eae3-271a-4812-8323-eab30103ec5d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received event network-changed-06998e1e-8ce7-484d-b3e4-7d44699229c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.101 2 DEBUG nova.compute.manager [req-7046266a-73fc-4410-b39a-7bc6485e7f66 req-1e78eae3-271a-4812-8323-eab30103ec5d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Refreshing instance network info cache due to event network-changed-06998e1e-8ce7-484d-b3e4-7d44699229c4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.102 2 DEBUG oslo_concurrency.lockutils [req-7046266a-73fc-4410-b39a-7bc6485e7f66 req-1e78eae3-271a-4812-8323-eab30103ec5d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.198 2 DEBUG nova.network.neutron [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 08 19:15:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:44.235 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:44.236 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:44.236 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.882 2 DEBUG nova.network.neutron [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Updating instance_info_cache with network_info: [{"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.906 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.906 2 DEBUG nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Instance network_info: |[{"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.907 2 DEBUG oslo_concurrency.lockutils [req-7046266a-73fc-4410-b39a-7bc6485e7f66 req-1e78eae3-271a-4812-8323-eab30103ec5d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.907 2 DEBUG nova.network.neutron [req-7046266a-73fc-4410-b39a-7bc6485e7f66 req-1e78eae3-271a-4812-8323-eab30103ec5d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Refreshing network info cache for port 06998e1e-8ce7-484d-b3e4-7d44699229c4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.912 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Start _get_guest_xml network_info=[{"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.918 2 WARNING nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.924 2 DEBUG nova.virt.libvirt.host [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.925 2 DEBUG nova.virt.libvirt.host [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.936 2 DEBUG nova.virt.libvirt.host [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.937 2 DEBUG nova.virt.libvirt.host [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.938 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.939 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.940 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.941 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.941 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.942 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.942 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.943 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.943 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.944 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.944 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.945 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.951 2 DEBUG nova.virt.libvirt.vif [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:15:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-91776509',display_name='tempest-TestNetworkBasicOps-server-91776509',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-91776509',id=13,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI8PTrGv1QybFIubtsg8lczGea0IvQL8pvhihemAZSj0UMnf1scRH00KmJvAMVhcwpSfJBSsSB9h8z57cU6NeYho/jEOEiMidDlTZU4qxsLiPufykBInXUSkP3hGqOiJaw==',key_name='tempest-TestNetworkBasicOps-916834063',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-q8bisj0t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:15:41Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=34ca788d-2398-4a40-9f96-040c0849b18f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.951 2 DEBUG nova.network.os_vif_util [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.953 2 DEBUG nova.network.os_vif_util [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:c0:df,bridge_name='br-int',has_traffic_filtering=True,id=06998e1e-8ce7-484d-b3e4-7d44699229c4,network=Network(ed492f30-88ab-4074-a37b-2efd9113a46f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06998e1e-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.955 2 DEBUG nova.objects.instance [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 34ca788d-2398-4a40-9f96-040c0849b18f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.969 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] End _get_guest_xml xml=<domain type="kvm">
Oct 08 19:15:44 compute-0 nova_compute[117514]:   <uuid>34ca788d-2398-4a40-9f96-040c0849b18f</uuid>
Oct 08 19:15:44 compute-0 nova_compute[117514]:   <name>instance-0000000d</name>
Oct 08 19:15:44 compute-0 nova_compute[117514]:   <memory>131072</memory>
Oct 08 19:15:44 compute-0 nova_compute[117514]:   <vcpu>1</vcpu>
Oct 08 19:15:44 compute-0 nova_compute[117514]:   <metadata>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <nova:name>tempest-TestNetworkBasicOps-server-91776509</nova:name>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <nova:creationTime>2025-10-08 19:15:44</nova:creationTime>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <nova:flavor name="m1.nano">
Oct 08 19:15:44 compute-0 nova_compute[117514]:         <nova:memory>128</nova:memory>
Oct 08 19:15:44 compute-0 nova_compute[117514]:         <nova:disk>1</nova:disk>
Oct 08 19:15:44 compute-0 nova_compute[117514]:         <nova:swap>0</nova:swap>
Oct 08 19:15:44 compute-0 nova_compute[117514]:         <nova:ephemeral>0</nova:ephemeral>
Oct 08 19:15:44 compute-0 nova_compute[117514]:         <nova:vcpus>1</nova:vcpus>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       </nova:flavor>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <nova:owner>
Oct 08 19:15:44 compute-0 nova_compute[117514]:         <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct 08 19:15:44 compute-0 nova_compute[117514]:         <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       </nova:owner>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <nova:ports>
Oct 08 19:15:44 compute-0 nova_compute[117514]:         <nova:port uuid="06998e1e-8ce7-484d-b3e4-7d44699229c4">
Oct 08 19:15:44 compute-0 nova_compute[117514]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:         </nova:port>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       </nova:ports>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     </nova:instance>
Oct 08 19:15:44 compute-0 nova_compute[117514]:   </metadata>
Oct 08 19:15:44 compute-0 nova_compute[117514]:   <sysinfo type="smbios">
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <system>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <entry name="manufacturer">RDO</entry>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <entry name="product">OpenStack Compute</entry>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <entry name="serial">34ca788d-2398-4a40-9f96-040c0849b18f</entry>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <entry name="uuid">34ca788d-2398-4a40-9f96-040c0849b18f</entry>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <entry name="family">Virtual Machine</entry>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     </system>
Oct 08 19:15:44 compute-0 nova_compute[117514]:   </sysinfo>
Oct 08 19:15:44 compute-0 nova_compute[117514]:   <os>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <boot dev="hd"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <smbios mode="sysinfo"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:   </os>
Oct 08 19:15:44 compute-0 nova_compute[117514]:   <features>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <acpi/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <apic/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <vmcoreinfo/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:   </features>
Oct 08 19:15:44 compute-0 nova_compute[117514]:   <clock offset="utc">
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <timer name="pit" tickpolicy="delay"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <timer name="hpet" present="no"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:   </clock>
Oct 08 19:15:44 compute-0 nova_compute[117514]:   <cpu mode="host-model" match="exact">
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <topology sockets="1" cores="1" threads="1"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:   </cpu>
Oct 08 19:15:44 compute-0 nova_compute[117514]:   <devices>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <disk type="file" device="disk">
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <target dev="vda" bus="virtio"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <disk type="file" device="cdrom">
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <driver name="qemu" type="raw" cache="none"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <source file="/var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk.config"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <target dev="sda" bus="sata"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     </disk>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <interface type="ethernet">
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <mac address="fa:16:3e:66:c0:df"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <driver name="vhost" rx_queue_size="512"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <mtu size="1442"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <target dev="tap06998e1e-8c"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     </interface>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <serial type="pty">
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <log file="/var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/console.log" append="off"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     </serial>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <video>
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <model type="virtio"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     </video>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <input type="tablet" bus="usb"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <rng model="virtio">
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <backend model="random">/dev/urandom</backend>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     </rng>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="pci" model="pcie-root-port"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <controller type="usb" index="0"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     <memballoon model="virtio">
Oct 08 19:15:44 compute-0 nova_compute[117514]:       <stats period="10"/>
Oct 08 19:15:44 compute-0 nova_compute[117514]:     </memballoon>
Oct 08 19:15:44 compute-0 nova_compute[117514]:   </devices>
Oct 08 19:15:44 compute-0 nova_compute[117514]: </domain>
Oct 08 19:15:44 compute-0 nova_compute[117514]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.971 2 DEBUG nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Preparing to wait for external event network-vif-plugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.972 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.972 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.972 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.973 2 DEBUG nova.virt.libvirt.vif [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:15:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-91776509',display_name='tempest-TestNetworkBasicOps-server-91776509',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-91776509',id=13,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI8PTrGv1QybFIubtsg8lczGea0IvQL8pvhihemAZSj0UMnf1scRH00KmJvAMVhcwpSfJBSsSB9h8z57cU6NeYho/jEOEiMidDlTZU4qxsLiPufykBInXUSkP3hGqOiJaw==',key_name='tempest-TestNetworkBasicOps-916834063',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-q8bisj0t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:15:41Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=34ca788d-2398-4a40-9f96-040c0849b18f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.973 2 DEBUG nova.network.os_vif_util [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.974 2 DEBUG nova.network.os_vif_util [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:c0:df,bridge_name='br-int',has_traffic_filtering=True,id=06998e1e-8ce7-484d-b3e4-7d44699229c4,network=Network(ed492f30-88ab-4074-a37b-2efd9113a46f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06998e1e-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.974 2 DEBUG os_vif [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:c0:df,bridge_name='br-int',has_traffic_filtering=True,id=06998e1e-8ce7-484d-b3e4-7d44699229c4,network=Network(ed492f30-88ab-4074-a37b-2efd9113a46f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06998e1e-8c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.975 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.976 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.982 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06998e1e-8c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.983 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap06998e1e-8c, col_values=(('external_ids', {'iface-id': '06998e1e-8ce7-484d-b3e4-7d44699229c4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:66:c0:df', 'vm-uuid': '34ca788d-2398-4a40-9f96-040c0849b18f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:15:44 compute-0 NetworkManager[1035]: <info>  [1759950944.9873] manager: (tap06998e1e-8c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.995 2 INFO os_vif [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:c0:df,bridge_name='br-int',has_traffic_filtering=True,id=06998e1e-8ce7-484d-b3e4-7d44699229c4,network=Network(ed492f30-88ab-4074-a37b-2efd9113a46f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06998e1e-8c')
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.052 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.053 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.053 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:66:c0:df, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.054 2 INFO nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Using config drive
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.376 2 INFO nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Creating config drive at /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk.config
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.380 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5gl3_4fm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.509 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5gl3_4fm" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:15:45 compute-0 kernel: tap06998e1e-8c: entered promiscuous mode
Oct 08 19:15:45 compute-0 NetworkManager[1035]: <info>  [1759950945.5844] manager: (tap06998e1e-8c): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Oct 08 19:15:45 compute-0 ovn_controller[19759]: 2025-10-08T19:15:45Z|00154|binding|INFO|Claiming lport 06998e1e-8ce7-484d-b3e4-7d44699229c4 for this chassis.
Oct 08 19:15:45 compute-0 ovn_controller[19759]: 2025-10-08T19:15:45Z|00155|binding|INFO|06998e1e-8ce7-484d-b3e4-7d44699229c4: Claiming fa:16:3e:66:c0:df 10.100.0.6
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.596 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:c0:df 10.100.0.6'], port_security=['fa:16:3e:66:c0:df 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed492f30-88ab-4074-a37b-2efd9113a46f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0aa04153-3da7-40f5-b74d-f2ebacf56fd3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d714e95d-17df-46e0-aa89-985c7cbd12a3, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=06998e1e-8ce7-484d-b3e4-7d44699229c4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.598 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 06998e1e-8ce7-484d-b3e4-7d44699229c4 in datapath ed492f30-88ab-4074-a37b-2efd9113a46f bound to our chassis
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.598 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ed492f30-88ab-4074-a37b-2efd9113a46f
Oct 08 19:15:45 compute-0 systemd-udevd[150905]: Network interface NamePolicy= disabled on kernel command line.
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.614 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[15079f8c-5e21-4d61-93e8-0c60ca29032e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.615 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH taped492f30-81 in ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.617 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface taped492f30-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.618 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f978ecb8-8be6-4ae0-b63e-4fd0af4996c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.619 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e83eda28-f06c-4025-bbe3-c48ff2fb6bca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:45 compute-0 NetworkManager[1035]: <info>  [1759950945.6278] device (tap06998e1e-8c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 08 19:15:45 compute-0 NetworkManager[1035]: <info>  [1759950945.6287] device (tap06998e1e-8c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.631 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[199b563c-4f4d-4804-8657-13bf37b87572]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:45 compute-0 systemd-machined[77568]: New machine qemu-13-instance-0000000d.
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:45 compute-0 ovn_controller[19759]: 2025-10-08T19:15:45Z|00156|binding|INFO|Setting lport 06998e1e-8ce7-484d-b3e4-7d44699229c4 ovn-installed in OVS
Oct 08 19:15:45 compute-0 ovn_controller[19759]: 2025-10-08T19:15:45Z|00157|binding|INFO|Setting lport 06998e1e-8ce7-484d-b3e4-7d44699229c4 up in Southbound
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:45 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-0000000d.
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.661 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[9dac3c58-0591-4f86-be5b-603c37d49423]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.689 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[6387f2fe-ba50-43d1-9fb6-4781c043af30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.696 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c43320-beb2-4d3f-b834-607cb293eece]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:45 compute-0 NetworkManager[1035]: <info>  [1759950945.6975] manager: (taped492f30-80): new Veth device (/org/freedesktop/NetworkManager/Devices/91)
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.738 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[8965ab01-6495-4048-a5a1-841389b5687d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.742 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[d6d9ab24-fd50-4c71-b6d1-9f6b8912041e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:45 compute-0 NetworkManager[1035]: <info>  [1759950945.7731] device (taped492f30-80): carrier: link connected
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.780 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[fc96d55b-13ad-42f6-a435-773fd67ce923]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.802 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ed5f33b7-0332-4ba6-a70f-1157cc5dc7c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped492f30-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:6a:4b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 161635, 'reachable_time': 15437, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 150938, 'error': None, 'target': 'ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.826 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[761217d0-2e59-42e6-b9c4-92ca903dd05d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fead:6a4b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 161635, 'tstamp': 161635}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 150939, 'error': None, 'target': 'ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.846 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[a13a8510-7330-403b-8a36-8b465c9699e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped492f30-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:6a:4b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 161635, 'reachable_time': 15437, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 150940, 'error': None, 'target': 'ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.865 2 DEBUG nova.compute.manager [req-2a1799c4-0291-4e7e-a2f1-31a7ce77551a req-09d5a077-4d9d-4cbb-beb6-cd3a59414b2d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received event network-vif-plugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.866 2 DEBUG oslo_concurrency.lockutils [req-2a1799c4-0291-4e7e-a2f1-31a7ce77551a req-09d5a077-4d9d-4cbb-beb6-cd3a59414b2d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.867 2 DEBUG oslo_concurrency.lockutils [req-2a1799c4-0291-4e7e-a2f1-31a7ce77551a req-09d5a077-4d9d-4cbb-beb6-cd3a59414b2d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.867 2 DEBUG oslo_concurrency.lockutils [req-2a1799c4-0291-4e7e-a2f1-31a7ce77551a req-09d5a077-4d9d-4cbb-beb6-cd3a59414b2d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.868 2 DEBUG nova.compute.manager [req-2a1799c4-0291-4e7e-a2f1-31a7ce77551a req-09d5a077-4d9d-4cbb-beb6-cd3a59414b2d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Processing event network-vif-plugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.890 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[aa30b568-56de-4920-9b74-6b3f82f018d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.926 2 DEBUG nova.network.neutron [req-7046266a-73fc-4410-b39a-7bc6485e7f66 req-1e78eae3-271a-4812-8323-eab30103ec5d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Updated VIF entry in instance network info cache for port 06998e1e-8ce7-484d-b3e4-7d44699229c4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.927 2 DEBUG nova.network.neutron [req-7046266a-73fc-4410-b39a-7bc6485e7f66 req-1e78eae3-271a-4812-8323-eab30103ec5d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Updating instance_info_cache with network_info: [{"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.956 2 DEBUG oslo_concurrency.lockutils [req-7046266a-73fc-4410-b39a-7bc6485e7f66 req-1e78eae3-271a-4812-8323-eab30103ec5d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.983 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[39dff74f-650d-4dda-a2e6-f61d55573330]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.985 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped492f30-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.986 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.987 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped492f30-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:15:45 compute-0 kernel: taped492f30-80: entered promiscuous mode
Oct 08 19:15:45 compute-0 NetworkManager[1035]: <info>  [1759950945.9904] manager: (taped492f30-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Oct 08 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.993 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=taped492f30-80, col_values=(('external_ids', {'iface-id': '58bfd3a1-f863-472a-ae8b-afc52524c7cc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:15:45 compute-0 ovn_controller[19759]: 2025-10-08T19:15:45Z|00158|binding|INFO|Releasing lport 58bfd3a1-f863-472a-ae8b-afc52524c7cc from this chassis (sb_readonly=0)
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:46.022 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed492f30-88ab-4074-a37b-2efd9113a46f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed492f30-88ab-4074-a37b-2efd9113a46f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:46.023 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[69485668-5361-4546-abad-5f9b309fc1c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:46.024 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]: global
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     log         /dev/log local0 debug
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     log-tag     haproxy-metadata-proxy-ed492f30-88ab-4074-a37b-2efd9113a46f
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     user        root
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     group       root
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     maxconn     1024
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     pidfile     /var/lib/neutron/external/pids/ed492f30-88ab-4074-a37b-2efd9113a46f.pid.haproxy
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     daemon
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]: defaults
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     log global
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     mode http
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     option httplog
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     option dontlognull
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     option http-server-close
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     option forwardfor
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     retries                 3
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     timeout http-request    30s
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     timeout connect         30s
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     timeout client          32s
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     timeout server          32s
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     timeout http-keep-alive 30s
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]: 
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]: listen listener
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     bind 169.254.169.254:80
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     server metadata /var/lib/neutron/metadata_proxy
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:     http-request add-header X-OVN-Network-ID ed492f30-88ab-4074-a37b-2efd9113a46f
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 08 19:15:46 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:46.025 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f', 'env', 'PROCESS_TAG=haproxy-ed492f30-88ab-4074-a37b-2efd9113a46f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ed492f30-88ab-4074-a37b-2efd9113a46f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 08 19:15:46 compute-0 podman[150977]: 2025-10-08 19:15:46.412511051 +0000 UTC m=+0.047066989 container create 646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:46 compute-0 podman[150977]: 2025-10-08 19:15:46.38993212 +0000 UTC m=+0.024488068 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 08 19:15:46 compute-0 systemd[1]: Started libpod-conmon-646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2.scope.
Oct 08 19:15:46 compute-0 systemd[1]: Started libcrun container.
Oct 08 19:15:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfed095f12a30ca38c2adabcd7edcb3837429b20db6e6f549181366129732c0f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 08 19:15:46 compute-0 podman[150977]: 2025-10-08 19:15:46.536463085 +0000 UTC m=+0.171019073 container init 646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:15:46 compute-0 podman[150977]: 2025-10-08 19:15:46.543406625 +0000 UTC m=+0.177962573 container start 646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:15:46 compute-0 neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f[150993]: [NOTICE]   (150997) : New worker (150999) forked
Oct 08 19:15:46 compute-0 neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f[150993]: [NOTICE]   (150997) : Loading success.
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.578 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950946.5780487, 34ca788d-2398-4a40-9f96-040c0849b18f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.578 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] VM Started (Lifecycle Event)
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.581 2 DEBUG nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.584 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.587 2 INFO nova.virt.libvirt.driver [-] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Instance spawned successfully.
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.588 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.615 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.621 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.625 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.625 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.626 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.626 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.627 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.627 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.669 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.670 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950946.5803564, 34ca788d-2398-4a40-9f96-040c0849b18f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.671 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] VM Paused (Lifecycle Event)
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.695 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.699 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950946.5837963, 34ca788d-2398-4a40-9f96-040c0849b18f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.699 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] VM Resumed (Lifecycle Event)
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.703 2 INFO nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Took 4.67 seconds to spawn the instance on the hypervisor.
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.704 2 DEBUG nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.716 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.720 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.746 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.766 2 INFO nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Took 5.16 seconds to build instance.
Oct 08 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.784 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:47 compute-0 nova_compute[117514]: 2025-10-08 19:15:47.967 2 DEBUG nova.compute.manager [req-61f8b5ca-003c-4ef2-a0f7-470b907fa277 req-a897d1ba-f550-4274-8459-41bd23f0df0f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received event network-vif-plugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:15:47 compute-0 nova_compute[117514]: 2025-10-08 19:15:47.967 2 DEBUG oslo_concurrency.lockutils [req-61f8b5ca-003c-4ef2-a0f7-470b907fa277 req-a897d1ba-f550-4274-8459-41bd23f0df0f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:15:47 compute-0 nova_compute[117514]: 2025-10-08 19:15:47.967 2 DEBUG oslo_concurrency.lockutils [req-61f8b5ca-003c-4ef2-a0f7-470b907fa277 req-a897d1ba-f550-4274-8459-41bd23f0df0f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:15:47 compute-0 nova_compute[117514]: 2025-10-08 19:15:47.968 2 DEBUG oslo_concurrency.lockutils [req-61f8b5ca-003c-4ef2-a0f7-470b907fa277 req-a897d1ba-f550-4274-8459-41bd23f0df0f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:15:47 compute-0 nova_compute[117514]: 2025-10-08 19:15:47.968 2 DEBUG nova.compute.manager [req-61f8b5ca-003c-4ef2-a0f7-470b907fa277 req-a897d1ba-f550-4274-8459-41bd23f0df0f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] No waiting events found dispatching network-vif-plugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:15:47 compute-0 nova_compute[117514]: 2025-10-08 19:15:47.968 2 WARNING nova.compute.manager [req-61f8b5ca-003c-4ef2-a0f7-470b907fa277 req-a897d1ba-f550-4274-8459-41bd23f0df0f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received unexpected event network-vif-plugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 for instance with vm_state active and task_state None.
Oct 08 19:15:49 compute-0 nova_compute[117514]: 2025-10-08 19:15:49.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:51 compute-0 nova_compute[117514]: 2025-10-08 19:15:51.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:52 compute-0 podman[151008]: 2025-10-08 19:15:52.667203704 +0000 UTC m=+0.080804991 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 08 19:15:53 compute-0 ovn_controller[19759]: 2025-10-08T19:15:53Z|00159|binding|INFO|Releasing lport 58bfd3a1-f863-472a-ae8b-afc52524c7cc from this chassis (sb_readonly=0)
Oct 08 19:15:53 compute-0 NetworkManager[1035]: <info>  [1759950953.3017] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Oct 08 19:15:53 compute-0 NetworkManager[1035]: <info>  [1759950953.3027] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Oct 08 19:15:53 compute-0 nova_compute[117514]: 2025-10-08 19:15:53.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:53 compute-0 ovn_controller[19759]: 2025-10-08T19:15:53Z|00160|binding|INFO|Releasing lport 58bfd3a1-f863-472a-ae8b-afc52524c7cc from this chassis (sb_readonly=0)
Oct 08 19:15:53 compute-0 nova_compute[117514]: 2025-10-08 19:15:53.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:53 compute-0 nova_compute[117514]: 2025-10-08 19:15:53.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:53 compute-0 nova_compute[117514]: 2025-10-08 19:15:53.652 2 DEBUG nova.compute.manager [req-c9da2321-86a2-4eda-b592-9ca32225bc02 req-49c1532a-ea9d-47d3-9868-3dd390e80aec bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received event network-changed-06998e1e-8ce7-484d-b3e4-7d44699229c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:15:53 compute-0 nova_compute[117514]: 2025-10-08 19:15:53.652 2 DEBUG nova.compute.manager [req-c9da2321-86a2-4eda-b592-9ca32225bc02 req-49c1532a-ea9d-47d3-9868-3dd390e80aec bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Refreshing instance network info cache due to event network-changed-06998e1e-8ce7-484d-b3e4-7d44699229c4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:15:53 compute-0 nova_compute[117514]: 2025-10-08 19:15:53.653 2 DEBUG oslo_concurrency.lockutils [req-c9da2321-86a2-4eda-b592-9ca32225bc02 req-49c1532a-ea9d-47d3-9868-3dd390e80aec bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:15:53 compute-0 nova_compute[117514]: 2025-10-08 19:15:53.653 2 DEBUG oslo_concurrency.lockutils [req-c9da2321-86a2-4eda-b592-9ca32225bc02 req-49c1532a-ea9d-47d3-9868-3dd390e80aec bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:15:53 compute-0 nova_compute[117514]: 2025-10-08 19:15:53.653 2 DEBUG nova.network.neutron [req-c9da2321-86a2-4eda-b592-9ca32225bc02 req-49c1532a-ea9d-47d3-9868-3dd390e80aec bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Refreshing network info cache for port 06998e1e-8ce7-484d-b3e4-7d44699229c4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:15:54 compute-0 nova_compute[117514]: 2025-10-08 19:15:54.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:55 compute-0 nova_compute[117514]: 2025-10-08 19:15:55.640 2 DEBUG nova.network.neutron [req-c9da2321-86a2-4eda-b592-9ca32225bc02 req-49c1532a-ea9d-47d3-9868-3dd390e80aec bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Updated VIF entry in instance network info cache for port 06998e1e-8ce7-484d-b3e4-7d44699229c4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:15:55 compute-0 nova_compute[117514]: 2025-10-08 19:15:55.640 2 DEBUG nova.network.neutron [req-c9da2321-86a2-4eda-b592-9ca32225bc02 req-49c1532a-ea9d-47d3-9868-3dd390e80aec bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Updating instance_info_cache with network_info: [{"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:15:55 compute-0 nova_compute[117514]: 2025-10-08 19:15:55.660 2 DEBUG oslo_concurrency.lockutils [req-c9da2321-86a2-4eda-b592-9ca32225bc02 req-49c1532a-ea9d-47d3-9868-3dd390e80aec bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:15:56 compute-0 nova_compute[117514]: 2025-10-08 19:15:56.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:15:57 compute-0 ovn_controller[19759]: 2025-10-08T19:15:57Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:66:c0:df 10.100.0.6
Oct 08 19:15:57 compute-0 ovn_controller[19759]: 2025-10-08T19:15:57Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:66:c0:df 10.100.0.6
Oct 08 19:15:59 compute-0 nova_compute[117514]: 2025-10-08 19:15:59.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:01 compute-0 nova_compute[117514]: 2025-10-08 19:16:01.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:02 compute-0 podman[151039]: 2025-10-08 19:16:02.68569629 +0000 UTC m=+0.100255502 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Oct 08 19:16:04 compute-0 nova_compute[117514]: 2025-10-08 19:16:04.038 2 INFO nova.compute.manager [None req-cf079058-f559-43bc-80a7-66aca8d75b7f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Get console output
Oct 08 19:16:04 compute-0 nova_compute[117514]: 2025-10-08 19:16:04.044 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct 08 19:16:04 compute-0 nova_compute[117514]: 2025-10-08 19:16:04.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:06 compute-0 nova_compute[117514]: 2025-10-08 19:16:06.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:07 compute-0 ovn_controller[19759]: 2025-10-08T19:16:07Z|00161|binding|INFO|Releasing lport 58bfd3a1-f863-472a-ae8b-afc52524c7cc from this chassis (sb_readonly=0)
Oct 08 19:16:07 compute-0 nova_compute[117514]: 2025-10-08 19:16:07.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:07 compute-0 ovn_controller[19759]: 2025-10-08T19:16:07Z|00162|binding|INFO|Releasing lport 58bfd3a1-f863-472a-ae8b-afc52524c7cc from this chassis (sb_readonly=0)
Oct 08 19:16:07 compute-0 nova_compute[117514]: 2025-10-08 19:16:07.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.247 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'name': 'tempest-TestNetworkBasicOps-server-91776509', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000d', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'hostId': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.248 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.275 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.write.latency volume: 2531671528 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.276 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72dc93e4-4db4-4575-acbc-0c2e2b07582a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2531671528, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-vda', 'timestamp': '2025-10-08T19:16:08.248361', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3e5df0fc-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': '2cca6adb85a607eed5dab499add86d5b03e800faa49c3e816772b47916001841'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-sda', 'timestamp': '2025-10-08T19:16:08.248361', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3e5dff98-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': '25a16238e93801fd21bf067414414e3f74fa2eb3e56da4ad65d65f7dfc8401e3'}]}, 'timestamp': '2025-10-08 19:16:08.276700', '_unique_id': '21df5d5edc454087b4e47633c16b87b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.279 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.282 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 34ca788d-2398-4a40-9f96-040c0849b18f / tap06998e1e-8c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.282 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96771a21-8611-4f38-8a35-94424aa57293', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.279430', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e5efd26-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': '5700ad9e7c26f163d2e0fbe4d921acbc0e6336fff5459f91a549d9b6d37171d9'}]}, 'timestamp': '2025-10-08 19:16:08.283192', '_unique_id': '19841d70531e4883975198841fad608f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.284 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.309 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/memory.usage volume: 46.76953125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd59aca0c-2277-4ac8-85c9-b9e1a2bb1cb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.76953125, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'timestamp': '2025-10-08T19:16:08.284401', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '3e6329e6-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.93888463, 'message_signature': '4a1d82699e05a1f4b51f0654251f7dd7ab16a200c2b38402e5630fcd7a81f930'}]}, 'timestamp': '2025-10-08 19:16:08.310672', '_unique_id': 'eb770c178df04038ad9eb1342c68a7a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.read.requests volume: 1095 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce5d6500-46a0-4717-b775-15f557ca001b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1095, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-vda', 'timestamp': '2025-10-08T19:16:08.313001', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3e6393e0-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': '155a16d03a70f7e8b006b4a5c6b86cbede419f6e682eda7487f3604ee58e1c7d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-sda', 'timestamp': '2025-10-08T19:16:08.313001', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3e639c00-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': 'e5cc09377353420cb56a5ba1a3e2581cde9d4b135f6eb62dd8bdae56e0f9c2bc'}]}, 'timestamp': '2025-10-08 19:16:08.313438', '_unique_id': 'dc35bf077240451b8467291769306fee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.314 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.314 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.incoming.bytes volume: 4787 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b322823a-9dd8-4330-a632-81daf3fb8db2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4787, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.314593', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e63d24c-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': 'fb056a64fc493f4dafa4ab1b1b64277432bca1582c91d8d11d74ebceb63b6066'}]}, 'timestamp': '2025-10-08 19:16:08.314878', '_unique_id': '2a64dac32fca4f22b157665029d2cbcf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.316 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.316 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-91776509>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-91776509>]
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.316 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.316 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.read.latency volume: 544247269 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.316 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.read.latency volume: 129480851 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec0f7003-2378-41ac-b812-f33f9a6034a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 544247269, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-vda', 'timestamp': '2025-10-08T19:16:08.316351', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3e6416d0-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': '2396955e96e4f1eb7dab241e320d147af8b2dcd3e3c7aa246f4c90358ef6f70e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 129480851, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 
'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-sda', 'timestamp': '2025-10-08T19:16:08.316351', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3e641f4a-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': '1b2fcbfe0ef43c5b8ee8ffcce760a0be58f73e9f3f154240e29da2be8ee36949'}]}, 'timestamp': '2025-10-08 19:16:08.316801', '_unique_id': 'b303460c716d433aa4209cac740f8693'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.319 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.319 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '476bf45b-8594-41e9-869f-f9eb34df3f0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.319364', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e648c0a-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': '2fcbb450a2ba4f36f339b9ed4a63299c68ee7cc96edb4c435d188325d46caa79'}]}, 'timestamp': '2025-10-08 19:16:08.319599', '_unique_id': 'a1bfac530f864186af172ebbdf152b1c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4415440a-dbc4-4007-ad0f-5945dc35be65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.320781', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e64c40e-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': '0ecb0f67182b8165754589fe2cce7d3e913cd56666d5d57c6fd2614fd79fe981'}]}, 'timestamp': '2025-10-08 19:16:08.321032', '_unique_id': '4f2703e7f82545c4a2b8382c27c882ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.322 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.322 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.outgoing.bytes volume: 3418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0e564cc-3577-4dee-9bf8-a8e1813ce352', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3418, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.322363', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e65022a-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': 'ed2a10d91fb35ae89864b9843209e8cf6c4d1360c00987bd42598d5716a6a0b1'}]}, 'timestamp': '2025-10-08 19:16:08.322658', '_unique_id': '163ab4d64cd440ea87b70d911ca8523a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9cfc94b7-f0e2-488b-b1f5-2d5b2c86fc37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.324047', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e6543de-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': 'f0e4d7ae985b4e58b6ec6cf83f7c6f2dfcc914a5322e6d521ccdbbebee823cd3'}]}, 'timestamp': '2025-10-08 19:16:08.324340', '_unique_id': '8244c41ed88a4703a36e1e855f620c08'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.325 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.341 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.341 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a5a6437-4fa9-4249-a2cd-01597b4b24e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-vda', 'timestamp': '2025-10-08T19:16:08.325724', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3e67e008-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.954958523, 'message_signature': 'e9c0729a8890df944a7fa07691c323a6783f301986a286585b90ba596ecc2ff3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-sda', 'timestamp': '2025-10-08T19:16:08.325724', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3e67ea4e-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.954958523, 'message_signature': '69faa9f89a4b48fa58723b30172ae680d9597365d937600f994222fe3a964020'}]}, 'timestamp': '2025-10-08 19:16:08.341684', '_unique_id': '6c8b0189508440f7bedee4e0cbb03c8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.343 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.343 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.write.requests volume: 315 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.343 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51614514-a2af-40e6-ae6e-affdac1802a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 315, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-vda', 'timestamp': '2025-10-08T19:16:08.343260', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3e683828-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': '33b03d1c55485236214fba45e252be2721e906e517b7346ef6394b768635303f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-sda', 'timestamp': '2025-10-08T19:16:08.343260', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3e6840fc-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': '078822d540be3fd9e001570b0ab2af9b6bc556f6d1801dac4f2c6af3cd45b08e'}]}, 'timestamp': '2025-10-08 19:16:08.343901', '_unique_id': '6cbb75bc20394891af6b7802b9110c5e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8027f9fc-34b1-4fdc-be7a-dbad4b396155', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.345041', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e687702-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': '9b57970da1a606f3e1b680603172f5ec3d5c0d69c37fa8fbaf73550cd3ca66ce'}]}, 'timestamp': '2025-10-08 19:16:08.345278', '_unique_id': '1f9885cdb3e041c495c96367a07c70c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.346 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.346 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.incoming.packets volume: 30 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad60ba67-b4aa-4bbe-b7c9-7ff7ff185fe6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 30, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.346339', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e68a970-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': '376f391ee88204142899f4c96da426255ca6f8fae2e7637d7c0f672564a19a32'}]}, 'timestamp': '2025-10-08 19:16:08.346563', '_unique_id': '50160e4e2e2041eaa48ff6e95d2e8930'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-91776509>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-91776509>]
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86b663c9-6515-4ebf-ba39-902cf423d9f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-vda', 'timestamp': '2025-10-08T19:16:08.348033', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3e68eba6-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.954958523, 'message_signature': 'a9bb50635ea3052b4563da5b6bebba8f6cf7576ac33d1e1833865423c9a74d80'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-sda', 'timestamp': '2025-10-08T19:16:08.348033', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3e68f470-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.954958523, 'message_signature': 'c952668bfeb2b9a3acd2993c5c7e4d5cb6ebc0eecaf406b575e9464edeb09b6d'}]}, 'timestamp': '2025-10-08 19:16:08.348470', '_unique_id': '2522ece249f94cf995c7f06a02ba2a42'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.349 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.349 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/cpu volume: 10390000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a7e66e6-c9a9-4987-b71a-7e35cd7f3ed8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10390000000, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'timestamp': '2025-10-08T19:16:08.349688', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '3e692c6a-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.93888463, 'message_signature': 'b91d0c66008a5cad45e0d1470c16cb16ba4d89ba1d34ad4ead0383f97cd9098d'}]}, 'timestamp': '2025-10-08 19:16:08.349928', '_unique_id': 'e4c48e6c2e50436a82c44b362653319b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.351 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.351 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-91776509>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-91776509>]
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.351 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.351 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.write.bytes volume: 72921088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.351 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aaa7ad5f-aca7-4002-ae9a-4e219493f152', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72921088, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-vda', 'timestamp': '2025-10-08T19:16:08.351240', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3e6968c4-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': '41db504c1c967a82720ac6d6305a1507d31cdd3c057a5cc55bb21432a93c5bdf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 
'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-sda', 'timestamp': '2025-10-08T19:16:08.351240', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3e6970d0-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': 'edf959db9ab02fb313f0c69647ae15094bcfd5cbfbbc43b6b635502b919918ad'}]}, 'timestamp': '2025-10-08 19:16:08.351666', '_unique_id': '9e48724af9cc43ea8e2d98899f32f6a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.read.bytes volume: 30534144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4afe885-ebc6-4093-a691-45f21730c944', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30534144, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-vda', 'timestamp': '2025-10-08T19:16:08.352763', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3e69a53c-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': 'd1c9da380ffd3fcc4b7c8624e09b9ee156440b08dfdb73b7d99b89dd1a513ff6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 
'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-sda', 'timestamp': '2025-10-08T19:16:08.352763', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3e69ad20-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': 'd06543f1134237cb3af1ed1878c9bd95b54fc0bbb22cc6636721c045b1796522'}]}, 'timestamp': '2025-10-08 19:16:08.353197', '_unique_id': '2882f3270d8345ce80be62bc7da8ec1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.354 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.354 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.354 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d736ac4-516d-46f0-98ea-feac04a96ea3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-vda', 'timestamp': '2025-10-08T19:16:08.354283', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3e69dfb6-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.954958523, 'message_signature': 'fb850d79a02b462f269d981ba43243a205aae1a8f30b8dab7c79f630f7c0c8cc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-sda', 'timestamp': '2025-10-08T19:16:08.354283', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3e69e74a-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.954958523, 'message_signature': 'a7a04f422489f2dc533b6a48fc5164bee6bd81a5a60e91aa3bb10f93c6f58de7'}]}, 'timestamp': '2025-10-08 19:16:08.354687', '_unique_id': '6244346e4a1d44a298f29715442aa3a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f02e40c8-c860-49a5-a46a-28aa516f4d16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.355764', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e6a1a3a-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': '14552c990777bf9130f26c5beee75171d7c8ded823d76484fa5eff514e5ac3d7'}]}, 'timestamp': '2025-10-08 19:16:08.356006', '_unique_id': '83d2c77094d94ce1901f74004919dd70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ca24662-01da-4cb8-b9b1-84d596be3cbc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.357073', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e6a4ca8-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': '6691dbfe056510ebc14d82f7073dc37d9bdf3ea0c2629b5c8bf14528f3623316'}]}, 'timestamp': '2025-10-08 19:16:08.357300', '_unique_id': '3ecff87bcd0b42c2b65d18acbad87a7a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     yield
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging 
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.358 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.358 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct 08 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.358 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-91776509>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-91776509>]
Oct 08 19:16:08 compute-0 nova_compute[117514]: 2025-10-08 19:16:08.416 2 INFO nova.compute.manager [None req-7e70d0df-0954-4234-a0c8-1d8584090420 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Get console output
Oct 08 19:16:08 compute-0 nova_compute[117514]: 2025-10-08 19:16:08.424 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct 08 19:16:08 compute-0 podman[151060]: 2025-10-08 19:16:08.656112657 +0000 UTC m=+0.070578416 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 08 19:16:08 compute-0 podman[151062]: 2025-10-08 19:16:08.689414697 +0000 UTC m=+0.086812814 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 08 19:16:08 compute-0 podman[151061]: 2025-10-08 19:16:08.705737878 +0000 UTC m=+0.110314532 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251001, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct 08 19:16:08 compute-0 nova_compute[117514]: 2025-10-08 19:16:08.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:16:09 compute-0 nova_compute[117514]: 2025-10-08 19:16:09.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:09 compute-0 NetworkManager[1035]: <info>  [1759950969.5446] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Oct 08 19:16:09 compute-0 NetworkManager[1035]: <info>  [1759950969.5463] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Oct 08 19:16:09 compute-0 ovn_controller[19759]: 2025-10-08T19:16:09Z|00163|binding|INFO|Releasing lport 58bfd3a1-f863-472a-ae8b-afc52524c7cc from this chassis (sb_readonly=0)
Oct 08 19:16:09 compute-0 nova_compute[117514]: 2025-10-08 19:16:09.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:09 compute-0 nova_compute[117514]: 2025-10-08 19:16:09.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:09 compute-0 nova_compute[117514]: 2025-10-08 19:16:09.802 2 INFO nova.compute.manager [None req-5300206d-4729-458f-b7a9-d3b3ebee095e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Get console output
Oct 08 19:16:09 compute-0 nova_compute[117514]: 2025-10-08 19:16:09.807 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct 08 19:16:09 compute-0 nova_compute[117514]: 2025-10-08 19:16:09.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.481 2 DEBUG nova.compute.manager [req-22c2d281-ede0-4d3d-b098-ae60029ca9f2 req-5634af68-6737-47e9-a68f-d3ab6250645b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received event network-changed-06998e1e-8ce7-484d-b3e4-7d44699229c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.482 2 DEBUG nova.compute.manager [req-22c2d281-ede0-4d3d-b098-ae60029ca9f2 req-5634af68-6737-47e9-a68f-d3ab6250645b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Refreshing instance network info cache due to event network-changed-06998e1e-8ce7-484d-b3e4-7d44699229c4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.482 2 DEBUG oslo_concurrency.lockutils [req-22c2d281-ede0-4d3d-b098-ae60029ca9f2 req-5634af68-6737-47e9-a68f-d3ab6250645b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.483 2 DEBUG oslo_concurrency.lockutils [req-22c2d281-ede0-4d3d-b098-ae60029ca9f2 req-5634af68-6737-47e9-a68f-d3ab6250645b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.483 2 DEBUG nova.network.neutron [req-22c2d281-ede0-4d3d-b098-ae60029ca9f2 req-5634af68-6737-47e9-a68f-d3ab6250645b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Refreshing network info cache for port 06998e1e-8ce7-484d-b3e4-7d44699229c4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.521 2 DEBUG oslo_concurrency.lockutils [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "34ca788d-2398-4a40-9f96-040c0849b18f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.522 2 DEBUG oslo_concurrency.lockutils [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.523 2 DEBUG oslo_concurrency.lockutils [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.523 2 DEBUG oslo_concurrency.lockutils [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.523 2 DEBUG oslo_concurrency.lockutils [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.524 2 INFO nova.compute.manager [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Terminating instance
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.526 2 DEBUG nova.compute.manager [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 08 19:16:10 compute-0 kernel: tap06998e1e-8c (unregistering): left promiscuous mode
Oct 08 19:16:10 compute-0 NetworkManager[1035]: <info>  [1759950970.5523] device (tap06998e1e-8c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 08 19:16:10 compute-0 ovn_controller[19759]: 2025-10-08T19:16:10Z|00164|binding|INFO|Releasing lport 06998e1e-8ce7-484d-b3e4-7d44699229c4 from this chassis (sb_readonly=0)
Oct 08 19:16:10 compute-0 ovn_controller[19759]: 2025-10-08T19:16:10Z|00165|binding|INFO|Setting lport 06998e1e-8ce7-484d-b3e4-7d44699229c4 down in Southbound
Oct 08 19:16:10 compute-0 ovn_controller[19759]: 2025-10-08T19:16:10Z|00166|binding|INFO|Removing iface tap06998e1e-8c ovn-installed in OVS
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:10.580 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:c0:df 10.100.0.6'], port_security=['fa:16:3e:66:c0:df 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed492f30-88ab-4074-a37b-2efd9113a46f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0aa04153-3da7-40f5-b74d-f2ebacf56fd3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d714e95d-17df-46e0-aa89-985c7cbd12a3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=06998e1e-8ce7-484d-b3e4-7d44699229c4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:16:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:10.584 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 06998e1e-8ce7-484d-b3e4-7d44699229c4 in datapath ed492f30-88ab-4074-a37b-2efd9113a46f unbound from our chassis
Oct 08 19:16:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:10.586 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ed492f30-88ab-4074-a37b-2efd9113a46f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 08 19:16:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:10.588 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[50f3775a-0542-48bf-8062-59cc90e8b7c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:16:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:10.589 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f namespace which is not needed anymore
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:10 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Oct 08 19:16:10 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Consumed 12.617s CPU time.
Oct 08 19:16:10 compute-0 systemd-machined[77568]: Machine qemu-13-instance-0000000d terminated.
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:10 compute-0 podman[151138]: 2025-10-08 19:16:10.793898658 +0000 UTC m=+0.100222571 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:16:10 compute-0 neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f[150993]: [NOTICE]   (150997) : haproxy version is 2.8.14-c23fe91
Oct 08 19:16:10 compute-0 neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f[150993]: [NOTICE]   (150997) : path to executable is /usr/sbin/haproxy
Oct 08 19:16:10 compute-0 neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f[150993]: [ALERT]    (150997) : Current worker (150999) exited with code 143 (Terminated)
Oct 08 19:16:10 compute-0 neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f[150993]: [WARNING]  (150997) : All workers exited. Exiting... (0)
Oct 08 19:16:10 compute-0 systemd[1]: libpod-646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2.scope: Deactivated successfully.
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.804 2 INFO nova.virt.libvirt.driver [-] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Instance destroyed successfully.
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.805 2 DEBUG nova.objects.instance [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 34ca788d-2398-4a40-9f96-040c0849b18f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 08 19:16:10 compute-0 podman[151156]: 2025-10-08 19:16:10.81064566 +0000 UTC m=+0.068543187 container died 646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.825 2 DEBUG nova.virt.libvirt.vif [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:15:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-91776509',display_name='tempest-TestNetworkBasicOps-server-91776509',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-91776509',id=13,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI8PTrGv1QybFIubtsg8lczGea0IvQL8pvhihemAZSj0UMnf1scRH00KmJvAMVhcwpSfJBSsSB9h8z57cU6NeYho/jEOEiMidDlTZU4qxsLiPufykBInXUSkP3hGqOiJaw==',key_name='tempest-TestNetworkBasicOps-916834063',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:15:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-q8bisj0t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:15:46Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=34ca788d-2398-4a40-9f96-040c0849b18f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.826 2 DEBUG nova.network.os_vif_util [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.826 2 DEBUG nova.network.os_vif_util [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:66:c0:df,bridge_name='br-int',has_traffic_filtering=True,id=06998e1e-8ce7-484d-b3e4-7d44699229c4,network=Network(ed492f30-88ab-4074-a37b-2efd9113a46f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06998e1e-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.827 2 DEBUG os_vif [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:c0:df,bridge_name='br-int',has_traffic_filtering=True,id=06998e1e-8ce7-484d-b3e4-7d44699229c4,network=Network(ed492f30-88ab-4074-a37b-2efd9113a46f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06998e1e-8c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.829 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06998e1e-8c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.838 2 INFO os_vif [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:c0:df,bridge_name='br-int',has_traffic_filtering=True,id=06998e1e-8ce7-484d-b3e4-7d44699229c4,network=Network(ed492f30-88ab-4074-a37b-2efd9113a46f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06998e1e-8c')
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.839 2 INFO nova.virt.libvirt.driver [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Deleting instance files /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f_del
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.839 2 INFO nova.virt.libvirt.driver [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Deletion of /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f_del complete
Oct 08 19:16:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2-userdata-shm.mount: Deactivated successfully.
Oct 08 19:16:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-bfed095f12a30ca38c2adabcd7edcb3837429b20db6e6f549181366129732c0f-merged.mount: Deactivated successfully.
Oct 08 19:16:10 compute-0 podman[151156]: 2025-10-08 19:16:10.87442598 +0000 UTC m=+0.132323467 container cleanup 646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.884 2 INFO nova.compute.manager [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Took 0.36 seconds to destroy the instance on the hypervisor.
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.884 2 DEBUG oslo.service.loopingcall [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.884 2 DEBUG nova.compute.manager [-] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.885 2 DEBUG nova.network.neutron [-] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 08 19:16:10 compute-0 systemd[1]: libpod-conmon-646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2.scope: Deactivated successfully.
Oct 08 19:16:10 compute-0 podman[151185]: 2025-10-08 19:16:10.909732478 +0000 UTC m=+0.099954153 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 08 19:16:10 compute-0 podman[151194]: 2025-10-08 19:16:10.939157126 +0000 UTC m=+0.120342061 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:16:10 compute-0 podman[151239]: 2025-10-08 19:16:10.952885242 +0000 UTC m=+0.051585709 container remove 646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 08 19:16:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:10.959 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ec551f93-a670-45bc-9b6f-75c994d318ed]: (4, ('Wed Oct  8 07:16:10 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f (646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2)\n646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2\nWed Oct  8 07:16:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f (646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2)\n646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:16:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:10.961 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[3c05792e-9db9-4474-9e7f-753ed98e7909]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:16:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:10.962 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped492f30-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:10 compute-0 kernel: taped492f30-80: left promiscuous mode
Oct 08 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:10.984 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[787aa40e-35b9-4dfc-85bc-51cc88d2fb7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:16:11 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:11.025 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[fbac3123-a210-4de3-b7c6-e0deff758349]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:16:11 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:11.034 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f2999cc5-91d4-4680-83c2-0ba2a66c3e4b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:16:11 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:11.054 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b8f812bb-8a32-4f02-a448-115a73cc6674]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 161626, 'reachable_time': 44684, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 151265, 'error': None, 'target': 'ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:16:11 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:11.057 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 08 19:16:11 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:11.057 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[54c88c11-2272-4fab-a658-441017e0f122]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 08 19:16:11 compute-0 systemd[1]: run-netns-ovnmeta\x2ded492f30\x2d88ab\x2d4074\x2da37b\x2d2efd9113a46f.mount: Deactivated successfully.
Oct 08 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.713 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.722 2 DEBUG nova.network.neutron [-] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.746 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.747 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.747 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.748 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.749 2 INFO nova.compute.manager [-] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Took 0.86 seconds to deallocate network for instance.
Oct 08 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.794 2 DEBUG oslo_concurrency.lockutils [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.795 2 DEBUG oslo_concurrency.lockutils [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.900 2 DEBUG nova.scheduler.client.report [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Refreshing inventories for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 08 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.948 2 DEBUG nova.scheduler.client.report [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Updating ProviderTree inventory for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 08 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.950 2 DEBUG nova.compute.provider_tree [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Updating inventory in ProviderTree for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 08 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.997 2 DEBUG nova.scheduler.client.report [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Refreshing aggregate associations for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.019 2 DEBUG nova.scheduler.client.report [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Refreshing trait associations for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AVX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.066 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.067 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6072MB free_disk=73.41375732421875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.068 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.072 2 DEBUG nova.compute.provider_tree [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.088 2 DEBUG nova.scheduler.client.report [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.115 2 DEBUG oslo_concurrency.lockutils [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.119 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.140 2 INFO nova.scheduler.client.report [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 34ca788d-2398-4a40-9f96-040c0849b18f
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.183 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.184 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.199 2 DEBUG nova.network.neutron [req-22c2d281-ede0-4d3d-b098-ae60029ca9f2 req-5634af68-6737-47e9-a68f-d3ab6250645b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Updated VIF entry in instance network info cache for port 06998e1e-8ce7-484d-b3e4-7d44699229c4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.200 2 DEBUG nova.network.neutron [req-22c2d281-ede0-4d3d-b098-ae60029ca9f2 req-5634af68-6737-47e9-a68f-d3ab6250645b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Updating instance_info_cache with network_info: [{"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.210 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.228 2 DEBUG oslo_concurrency.lockutils [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.706s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.230 2 DEBUG oslo_concurrency.lockutils [req-22c2d281-ede0-4d3d-b098-ae60029ca9f2 req-5634af68-6737-47e9-a68f-d3ab6250645b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.231 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.250 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.250 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.554 2 DEBUG nova.compute.manager [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received event network-vif-unplugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.555 2 DEBUG oslo_concurrency.lockutils [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.555 2 DEBUG oslo_concurrency.lockutils [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.556 2 DEBUG oslo_concurrency.lockutils [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.556 2 DEBUG nova.compute.manager [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] No waiting events found dispatching network-vif-unplugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.557 2 WARNING nova.compute.manager [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received unexpected event network-vif-unplugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 for instance with vm_state deleted and task_state None.
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.557 2 DEBUG nova.compute.manager [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received event network-vif-plugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.558 2 DEBUG oslo_concurrency.lockutils [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.558 2 DEBUG oslo_concurrency.lockutils [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.559 2 DEBUG oslo_concurrency.lockutils [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.559 2 DEBUG nova.compute.manager [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] No waiting events found dispatching network-vif-plugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.559 2 WARNING nova.compute.manager [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received unexpected event network-vif-plugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 for instance with vm_state deleted and task_state None.
Oct 08 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.560 2 DEBUG nova.compute.manager [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received event network-vif-deleted-06998e1e-8ce7-484d-b3e4-7d44699229c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 08 19:16:13 compute-0 nova_compute[117514]: 2025-10-08 19:16:13.251 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:16:13 compute-0 nova_compute[117514]: 2025-10-08 19:16:13.252 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:16:13 compute-0 nova_compute[117514]: 2025-10-08 19:16:13.273 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 08 19:16:13 compute-0 nova_compute[117514]: 2025-10-08 19:16:13.274 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:16:13 compute-0 nova_compute[117514]: 2025-10-08 19:16:13.274 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:16:13 compute-0 nova_compute[117514]: 2025-10-08 19:16:13.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:16:13 compute-0 nova_compute[117514]: 2025-10-08 19:16:13.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:16:14 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:14.651 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:16:14 compute-0 nova_compute[117514]: 2025-10-08 19:16:14.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:14 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:14.654 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 08 19:16:14 compute-0 nova_compute[117514]: 2025-10-08 19:16:14.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:16:14 compute-0 nova_compute[117514]: 2025-10-08 19:16:14.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 08 19:16:15 compute-0 nova_compute[117514]: 2025-10-08 19:16:15.732 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:16:15 compute-0 nova_compute[117514]: 2025-10-08 19:16:15.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:16 compute-0 nova_compute[117514]: 2025-10-08 19:16:16.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:16 compute-0 nova_compute[117514]: 2025-10-08 19:16:16.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:16:16 compute-0 nova_compute[117514]: 2025-10-08 19:16:16.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:16:16 compute-0 nova_compute[117514]: 2025-10-08 19:16:16.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:16 compute-0 nova_compute[117514]: 2025-10-08 19:16:16.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:18 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:18.655 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:16:19 compute-0 nova_compute[117514]: 2025-10-08 19:16:19.729 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:16:19 compute-0 nova_compute[117514]: 2025-10-08 19:16:19.729 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 08 19:16:19 compute-0 nova_compute[117514]: 2025-10-08 19:16:19.747 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 08 19:16:20 compute-0 nova_compute[117514]: 2025-10-08 19:16:20.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:21 compute-0 nova_compute[117514]: 2025-10-08 19:16:21.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:23 compute-0 podman[151268]: 2025-10-08 19:16:23.665134398 +0000 UTC m=+0.083793687 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 08 19:16:25 compute-0 nova_compute[117514]: 2025-10-08 19:16:25.803 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950970.8029397, 34ca788d-2398-4a40-9f96-040c0849b18f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 08 19:16:25 compute-0 nova_compute[117514]: 2025-10-08 19:16:25.804 2 INFO nova.compute.manager [-] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] VM Stopped (Lifecycle Event)
Oct 08 19:16:25 compute-0 nova_compute[117514]: 2025-10-08 19:16:25.834 2 DEBUG nova.compute.manager [None req-a9321ac2-9f92-477a-907c-9a01fc35389e - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 08 19:16:25 compute-0 nova_compute[117514]: 2025-10-08 19:16:25.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:26 compute-0 nova_compute[117514]: 2025-10-08 19:16:26.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:30 compute-0 nova_compute[117514]: 2025-10-08 19:16:30.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:31 compute-0 nova_compute[117514]: 2025-10-08 19:16:31.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:33 compute-0 podman[151292]: 2025-10-08 19:16:33.677362214 +0000 UTC m=+0.094950860 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 08 19:16:35 compute-0 nova_compute[117514]: 2025-10-08 19:16:35.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:36 compute-0 nova_compute[117514]: 2025-10-08 19:16:36.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:39 compute-0 podman[151314]: 2025-10-08 19:16:39.692885232 +0000 UTC m=+0.091094958 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 08 19:16:39 compute-0 podman[151313]: 2025-10-08 19:16:39.69318219 +0000 UTC m=+0.097769080 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:16:39 compute-0 podman[151312]: 2025-10-08 19:16:39.704019933 +0000 UTC m=+0.112703091 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41)
Oct 08 19:16:40 compute-0 nova_compute[117514]: 2025-10-08 19:16:40.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:41 compute-0 nova_compute[117514]: 2025-10-08 19:16:41.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:41 compute-0 podman[151375]: 2025-10-08 19:16:41.665412406 +0000 UTC m=+0.072727328 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct 08 19:16:41 compute-0 podman[151373]: 2025-10-08 19:16:41.685805344 +0000 UTC m=+0.093350193 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true)
Oct 08 19:16:41 compute-0 podman[151374]: 2025-10-08 19:16:41.718149336 +0000 UTC m=+0.120886366 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 08 19:16:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:44.235 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:16:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:44.236 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:16:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:44.236 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:16:45 compute-0 nova_compute[117514]: 2025-10-08 19:16:45.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:46 compute-0 nova_compute[117514]: 2025-10-08 19:16:46.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:47 compute-0 ovn_controller[19759]: 2025-10-08T19:16:47Z|00167|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Oct 08 19:16:50 compute-0 nova_compute[117514]: 2025-10-08 19:16:50.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:51 compute-0 nova_compute[117514]: 2025-10-08 19:16:51.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:54 compute-0 podman[151432]: 2025-10-08 19:16:54.68241522 +0000 UTC m=+0.099947464 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 08 19:16:55 compute-0 nova_compute[117514]: 2025-10-08 19:16:55.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:16:56 compute-0 nova_compute[117514]: 2025-10-08 19:16:56.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:00 compute-0 nova_compute[117514]: 2025-10-08 19:17:00.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:01 compute-0 nova_compute[117514]: 2025-10-08 19:17:01.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:03 compute-0 sshd-session[151457]: Accepted publickey for zuul from 192.168.122.10 port 53994 ssh2: ECDSA SHA256:i+73Mx2Y/ukt1b+huf+9w+ftZalnyybbDU6glTR0JfU
Oct 08 19:17:03 compute-0 systemd[1]: Created slice User Slice of UID 1000.
Oct 08 19:17:03 compute-0 systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct 08 19:17:03 compute-0 systemd-logind[844]: New session 12 of user zuul.
Oct 08 19:17:03 compute-0 systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct 08 19:17:03 compute-0 systemd[1]: Starting User Manager for UID 1000...
Oct 08 19:17:03 compute-0 systemd[151478]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 08 19:17:03 compute-0 podman[151459]: 2025-10-08 19:17:03.865987464 +0000 UTC m=+0.118899790 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:17:04 compute-0 systemd[151478]: Queued start job for default target Main User Target.
Oct 08 19:17:04 compute-0 systemd[151478]: Created slice User Application Slice.
Oct 08 19:17:04 compute-0 systemd[151478]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 08 19:17:04 compute-0 systemd[151478]: Started Daily Cleanup of User's Temporary Directories.
Oct 08 19:17:04 compute-0 systemd[151478]: Reached target Paths.
Oct 08 19:17:04 compute-0 systemd[151478]: Reached target Timers.
Oct 08 19:17:04 compute-0 systemd[151478]: Starting D-Bus User Message Bus Socket...
Oct 08 19:17:04 compute-0 systemd[151478]: Starting Create User's Volatile Files and Directories...
Oct 08 19:17:04 compute-0 systemd[151478]: Listening on D-Bus User Message Bus Socket.
Oct 08 19:17:04 compute-0 systemd[151478]: Reached target Sockets.
Oct 08 19:17:04 compute-0 systemd[151478]: Finished Create User's Volatile Files and Directories.
Oct 08 19:17:04 compute-0 systemd[151478]: Reached target Basic System.
Oct 08 19:17:04 compute-0 systemd[151478]: Reached target Main User Target.
Oct 08 19:17:04 compute-0 systemd[151478]: Startup finished in 182ms.
Oct 08 19:17:04 compute-0 systemd[1]: Started User Manager for UID 1000.
Oct 08 19:17:04 compute-0 systemd[1]: Started Session 12 of User zuul.
Oct 08 19:17:04 compute-0 sshd-session[151457]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 08 19:17:04 compute-0 sudo[151497]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Oct 08 19:17:04 compute-0 sudo[151497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:17:05 compute-0 nova_compute[117514]: 2025-10-08 19:17:05.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:06 compute-0 nova_compute[117514]: 2025-10-08 19:17:06.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:09 compute-0 ovs-vsctl[151670]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct 08 19:17:09 compute-0 nova_compute[117514]: 2025-10-08 19:17:09.735 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:17:09 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 151521 (sos)
Oct 08 19:17:09 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Oct 08 19:17:09 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Oct 08 19:17:10 compute-0 podman[151718]: 2025-10-08 19:17:10.075299672 +0000 UTC m=+0.091547157 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Oct 08 19:17:10 compute-0 podman[151720]: 2025-10-08 19:17:10.075650702 +0000 UTC m=+0.097965662 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 08 19:17:10 compute-0 podman[151717]: 2025-10-08 19:17:10.081982224 +0000 UTC m=+0.104139239 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Oct 08 19:17:10 compute-0 virtqemud[117415]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct 08 19:17:10 compute-0 virtqemud[117415]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct 08 19:17:10 compute-0 virtqemud[117415]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 08 19:17:10 compute-0 nova_compute[117514]: 2025-10-08 19:17:10.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:11 compute-0 kernel: block sr0: the capability attribute has been deprecated.
Oct 08 19:17:11 compute-0 nova_compute[117514]: 2025-10-08 19:17:11.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:11 compute-0 crontab[152157]: (root) LIST (root)
Oct 08 19:17:12 compute-0 podman[152210]: 2025-10-08 19:17:12.66075791 +0000 UTC m=+0.067910185 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 08 19:17:12 compute-0 podman[152206]: 2025-10-08 19:17:12.706280781 +0000 UTC m=+0.116204526 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:17:12 compute-0 nova_compute[117514]: 2025-10-08 19:17:12.713 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:17:12 compute-0 nova_compute[117514]: 2025-10-08 19:17:12.715 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:17:12 compute-0 nova_compute[117514]: 2025-10-08 19:17:12.716 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:17:12 compute-0 nova_compute[117514]: 2025-10-08 19:17:12.716 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:17:12 compute-0 podman[152209]: 2025-10-08 19:17:12.725049231 +0000 UTC m=+0.134495962 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:17:12 compute-0 nova_compute[117514]: 2025-10-08 19:17:12.945 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 08 19:17:12 compute-0 nova_compute[117514]: 2025-10-08 19:17:12.945 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:17:12 compute-0 nova_compute[117514]: 2025-10-08 19:17:12.945 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.745 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.746 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.746 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.746 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.879 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.880 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5839MB free_disk=73.2657241821289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.880 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.880 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.951 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.952 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.973 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.987 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.989 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.989 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:17:14 compute-0 systemd[1]: Starting Hostname Service...
Oct 08 19:17:14 compute-0 systemd[1]: Started Hostname Service.
Oct 08 19:17:15 compute-0 nova_compute[117514]: 2025-10-08 19:17:15.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:15 compute-0 nova_compute[117514]: 2025-10-08 19:17:15.989 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:17:16 compute-0 nova_compute[117514]: 2025-10-08 19:17:16.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:17 compute-0 nova_compute[117514]: 2025-10-08 19:17:17.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:17:17 compute-0 nova_compute[117514]: 2025-10-08 19:17:17.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:17:18 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Oct 08 19:17:18 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Oct 08 19:17:18 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Oct 08 19:17:18 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Oct 08 19:17:18 compute-0 kernel: cfg80211: failed to load regulatory.db
Oct 08 19:17:18 compute-0 nova_compute[117514]: 2025-10-08 19:17:18.713 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:17:20 compute-0 ovs-appctl[153276]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 08 19:17:20 compute-0 ovs-appctl[153290]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 08 19:17:20 compute-0 ovs-appctl[153295]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 08 19:17:20 compute-0 nova_compute[117514]: 2025-10-08 19:17:20.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:21 compute-0 nova_compute[117514]: 2025-10-08 19:17:21.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:24 compute-0 podman[154284]: 2025-10-08 19:17:24.892200246 +0000 UTC m=+0.068270127 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 08 19:17:25 compute-0 nova_compute[117514]: 2025-10-08 19:17:25.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:26 compute-0 nova_compute[117514]: 2025-10-08 19:17:26.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:28 compute-0 virtqemud[117415]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 08 19:17:30 compute-0 nova_compute[117514]: 2025-10-08 19:17:30.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:31 compute-0 systemd[1]: Starting Time & Date Service...
Oct 08 19:17:31 compute-0 systemd[1]: Started Time & Date Service.
Oct 08 19:17:31 compute-0 nova_compute[117514]: 2025-10-08 19:17:31.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:34 compute-0 podman[154842]: 2025-10-08 19:17:34.518764273 +0000 UTC m=+0.093663537 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=edpm, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:17:35 compute-0 nova_compute[117514]: 2025-10-08 19:17:35.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:36 compute-0 nova_compute[117514]: 2025-10-08 19:17:36.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:40 compute-0 podman[154866]: 2025-10-08 19:17:40.705336351 +0000 UTC m=+0.092544885 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 08 19:17:40 compute-0 podman[154865]: 2025-10-08 19:17:40.715901646 +0000 UTC m=+0.113561351 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct 08 19:17:40 compute-0 podman[154864]: 2025-10-08 19:17:40.740295138 +0000 UTC m=+0.140784754 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Oct 08 19:17:40 compute-0 nova_compute[117514]: 2025-10-08 19:17:40.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:41 compute-0 nova_compute[117514]: 2025-10-08 19:17:41.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:43 compute-0 podman[154927]: 2025-10-08 19:17:43.721645274 +0000 UTC m=+0.135514503 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:17:43 compute-0 podman[154929]: 2025-10-08 19:17:43.731336553 +0000 UTC m=+0.132879897 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct 08 19:17:43 compute-0 podman[154928]: 2025-10-08 19:17:43.749047443 +0000 UTC m=+0.159050911 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct 08 19:17:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:17:44.236 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:17:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:17:44.237 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:17:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:17:44.237 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:17:45 compute-0 nova_compute[117514]: 2025-10-08 19:17:45.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:46 compute-0 nova_compute[117514]: 2025-10-08 19:17:46.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:48 compute-0 sudo[151497]: pam_unix(sudo:session): session closed for user root
Oct 08 19:17:48 compute-0 sshd-session[151496]: Received disconnect from 192.168.122.10 port 53994:11: disconnected by user
Oct 08 19:17:48 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Oct 08 19:17:48 compute-0 sshd-session[151496]: Disconnected from user zuul 192.168.122.10 port 53994
Oct 08 19:17:48 compute-0 systemd[1]: session-12.scope: Consumed 1min 16.025s CPU time, 580.3M memory peak, read 172.9M from disk, written 17.6M to disk.
Oct 08 19:17:48 compute-0 sshd-session[151457]: pam_unix(sshd:session): session closed for user zuul
Oct 08 19:17:48 compute-0 systemd-logind[844]: Session 12 logged out. Waiting for processes to exit.
Oct 08 19:17:48 compute-0 systemd-logind[844]: Removed session 12.
Oct 08 19:17:48 compute-0 sshd-session[154992]: Accepted publickey for zuul from 192.168.122.10 port 34316 ssh2: ECDSA SHA256:i+73Mx2Y/ukt1b+huf+9w+ftZalnyybbDU6glTR0JfU
Oct 08 19:17:48 compute-0 systemd-logind[844]: New session 14 of user zuul.
Oct 08 19:17:48 compute-0 systemd[1]: Started Session 14 of User zuul.
Oct 08 19:17:48 compute-0 sshd-session[154992]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 08 19:17:49 compute-0 sudo[154996]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-10-08-mxwgoir.tar.xz
Oct 08 19:17:49 compute-0 sudo[154996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:17:49 compute-0 sudo[154996]: pam_unix(sudo:session): session closed for user root
Oct 08 19:17:49 compute-0 sshd-session[154995]: Received disconnect from 192.168.122.10 port 34316:11: disconnected by user
Oct 08 19:17:49 compute-0 sshd-session[154995]: Disconnected from user zuul 192.168.122.10 port 34316
Oct 08 19:17:49 compute-0 sshd-session[154992]: pam_unix(sshd:session): session closed for user zuul
Oct 08 19:17:49 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Oct 08 19:17:49 compute-0 systemd-logind[844]: Session 14 logged out. Waiting for processes to exit.
Oct 08 19:17:49 compute-0 systemd-logind[844]: Removed session 14.
Oct 08 19:17:49 compute-0 sshd-session[155021]: Accepted publickey for zuul from 192.168.122.10 port 34324 ssh2: ECDSA SHA256:i+73Mx2Y/ukt1b+huf+9w+ftZalnyybbDU6glTR0JfU
Oct 08 19:17:49 compute-0 systemd-logind[844]: New session 15 of user zuul.
Oct 08 19:17:49 compute-0 systemd[1]: Started Session 15 of User zuul.
Oct 08 19:17:49 compute-0 sshd-session[155021]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 08 19:17:49 compute-0 sudo[155025]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Oct 08 19:17:49 compute-0 sudo[155025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:17:49 compute-0 sudo[155025]: pam_unix(sudo:session): session closed for user root
Oct 08 19:17:49 compute-0 sshd-session[155024]: Received disconnect from 192.168.122.10 port 34324:11: disconnected by user
Oct 08 19:17:49 compute-0 sshd-session[155024]: Disconnected from user zuul 192.168.122.10 port 34324
Oct 08 19:17:49 compute-0 sshd-session[155021]: pam_unix(sshd:session): session closed for user zuul
Oct 08 19:17:49 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Oct 08 19:17:49 compute-0 systemd-logind[844]: Session 15 logged out. Waiting for processes to exit.
Oct 08 19:17:49 compute-0 systemd-logind[844]: Removed session 15.
Oct 08 19:17:50 compute-0 nova_compute[117514]: 2025-10-08 19:17:50.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:51 compute-0 nova_compute[117514]: 2025-10-08 19:17:51.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:55 compute-0 podman[155051]: 2025-10-08 19:17:55.652448333 +0000 UTC m=+0.070412958 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 08 19:17:56 compute-0 nova_compute[117514]: 2025-10-08 19:17:56.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:56 compute-0 nova_compute[117514]: 2025-10-08 19:17:56.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:17:59 compute-0 systemd[1]: Stopping User Manager for UID 1000...
Oct 08 19:17:59 compute-0 systemd[151478]: Activating special unit Exit the Session...
Oct 08 19:17:59 compute-0 systemd[151478]: Stopped target Main User Target.
Oct 08 19:17:59 compute-0 systemd[151478]: Stopped target Basic System.
Oct 08 19:17:59 compute-0 systemd[151478]: Stopped target Paths.
Oct 08 19:17:59 compute-0 systemd[151478]: Stopped target Sockets.
Oct 08 19:17:59 compute-0 systemd[151478]: Stopped target Timers.
Oct 08 19:17:59 compute-0 systemd[151478]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct 08 19:17:59 compute-0 systemd[151478]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 08 19:17:59 compute-0 systemd[151478]: Closed D-Bus User Message Bus Socket.
Oct 08 19:17:59 compute-0 systemd[151478]: Stopped Create User's Volatile Files and Directories.
Oct 08 19:17:59 compute-0 systemd[151478]: Removed slice User Application Slice.
Oct 08 19:17:59 compute-0 systemd[151478]: Reached target Shutdown.
Oct 08 19:17:59 compute-0 systemd[151478]: Finished Exit the Session.
Oct 08 19:17:59 compute-0 systemd[151478]: Reached target Exit the Session.
Oct 08 19:17:59 compute-0 systemd[1]: user@1000.service: Deactivated successfully.
Oct 08 19:17:59 compute-0 systemd[1]: Stopped User Manager for UID 1000.
Oct 08 19:17:59 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/1000...
Oct 08 19:17:59 compute-0 systemd[1]: run-user-1000.mount: Deactivated successfully.
Oct 08 19:17:59 compute-0 systemd[1]: user-runtime-dir@1000.service: Deactivated successfully.
Oct 08 19:17:59 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/1000.
Oct 08 19:17:59 compute-0 systemd[1]: Removed slice User Slice of UID 1000.
Oct 08 19:17:59 compute-0 systemd[1]: user-1000.slice: Consumed 1min 16.666s CPU time, 585.7M memory peak, read 172.9M from disk, written 17.6M to disk.
Oct 08 19:18:01 compute-0 nova_compute[117514]: 2025-10-08 19:18:01.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:01 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 08 19:18:01 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 08 19:18:01 compute-0 nova_compute[117514]: 2025-10-08 19:18:01.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:04 compute-0 podman[155081]: 2025-10-08 19:18:04.685246087 +0000 UTC m=+0.106192319 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct 08 19:18:06 compute-0 nova_compute[117514]: 2025-10-08 19:18:06.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:06 compute-0 nova_compute[117514]: 2025-10-08 19:18:06.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:18:10 compute-0 nova_compute[117514]: 2025-10-08 19:18:10.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:18:11 compute-0 nova_compute[117514]: 2025-10-08 19:18:11.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:11 compute-0 podman[155101]: 2025-10-08 19:18:11.645643603 +0000 UTC m=+0.063506999 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 08 19:18:11 compute-0 nova_compute[117514]: 2025-10-08 19:18:11.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:11 compute-0 podman[155102]: 2025-10-08 19:18:11.680011263 +0000 UTC m=+0.094111961 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0)
Oct 08 19:18:11 compute-0 podman[155103]: 2025-10-08 19:18:11.685434439 +0000 UTC m=+0.092039001 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:18:12 compute-0 nova_compute[117514]: 2025-10-08 19:18:12.713 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:18:12 compute-0 nova_compute[117514]: 2025-10-08 19:18:12.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:18:12 compute-0 nova_compute[117514]: 2025-10-08 19:18:12.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:18:12 compute-0 nova_compute[117514]: 2025-10-08 19:18:12.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:18:12 compute-0 nova_compute[117514]: 2025-10-08 19:18:12.743 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 08 19:18:12 compute-0 nova_compute[117514]: 2025-10-08 19:18:12.743 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:18:12 compute-0 nova_compute[117514]: 2025-10-08 19:18:12.744 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:18:14 compute-0 podman[155170]: 2025-10-08 19:18:14.677103462 +0000 UTC m=+0.077699168 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 08 19:18:14 compute-0 podman[155168]: 2025-10-08 19:18:14.677972217 +0000 UTC m=+0.090203428 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid)
Oct 08 19:18:14 compute-0 nova_compute[117514]: 2025-10-08 19:18:14.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:18:14 compute-0 podman[155169]: 2025-10-08 19:18:14.716971179 +0000 UTC m=+0.124649169 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 08 19:18:14 compute-0 nova_compute[117514]: 2025-10-08 19:18:14.748 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:18:14 compute-0 nova_compute[117514]: 2025-10-08 19:18:14.748 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:18:14 compute-0 nova_compute[117514]: 2025-10-08 19:18:14.749 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:18:14 compute-0 nova_compute[117514]: 2025-10-08 19:18:14.749 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:18:14 compute-0 nova_compute[117514]: 2025-10-08 19:18:14.967 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:18:14 compute-0 nova_compute[117514]: 2025-10-08 19:18:14.968 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5988MB free_disk=73.40881729125977GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:18:14 compute-0 nova_compute[117514]: 2025-10-08 19:18:14.969 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:18:14 compute-0 nova_compute[117514]: 2025-10-08 19:18:14.969 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:18:15 compute-0 nova_compute[117514]: 2025-10-08 19:18:15.060 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:18:15 compute-0 nova_compute[117514]: 2025-10-08 19:18:15.061 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:18:15 compute-0 nova_compute[117514]: 2025-10-08 19:18:15.082 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:18:15 compute-0 nova_compute[117514]: 2025-10-08 19:18:15.096 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:18:15 compute-0 nova_compute[117514]: 2025-10-08 19:18:15.098 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:18:15 compute-0 nova_compute[117514]: 2025-10-08 19:18:15.098 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:18:16 compute-0 nova_compute[117514]: 2025-10-08 19:18:16.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:16 compute-0 nova_compute[117514]: 2025-10-08 19:18:16.099 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:18:16 compute-0 nova_compute[117514]: 2025-10-08 19:18:16.100 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:18:16 compute-0 nova_compute[117514]: 2025-10-08 19:18:16.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:17 compute-0 nova_compute[117514]: 2025-10-08 19:18:17.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:18:18 compute-0 nova_compute[117514]: 2025-10-08 19:18:18.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:18:21 compute-0 nova_compute[117514]: 2025-10-08 19:18:21.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:21 compute-0 nova_compute[117514]: 2025-10-08 19:18:21.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:26 compute-0 nova_compute[117514]: 2025-10-08 19:18:26.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:26 compute-0 podman[155233]: 2025-10-08 19:18:26.658196619 +0000 UTC m=+0.073437796 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 08 19:18:26 compute-0 nova_compute[117514]: 2025-10-08 19:18:26.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:28 compute-0 unix_chkpwd[155259]: password check failed for user (root)
Oct 08 19:18:28 compute-0 sshd-session[155257]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Oct 08 19:18:29 compute-0 sshd-session[155257]: Failed password for root from 193.46.255.20 port 35386 ssh2
Oct 08 19:18:30 compute-0 unix_chkpwd[155260]: password check failed for user (root)
Oct 08 19:18:31 compute-0 nova_compute[117514]: 2025-10-08 19:18:31.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:31 compute-0 nova_compute[117514]: 2025-10-08 19:18:31.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:32 compute-0 sshd-session[155257]: Failed password for root from 193.46.255.20 port 35386 ssh2
Oct 08 19:18:33 compute-0 unix_chkpwd[155261]: password check failed for user (root)
Oct 08 19:18:35 compute-0 sshd-session[155257]: Failed password for root from 193.46.255.20 port 35386 ssh2
Oct 08 19:18:35 compute-0 sshd-session[155257]: Received disconnect from 193.46.255.20 port 35386:11:  [preauth]
Oct 08 19:18:35 compute-0 sshd-session[155257]: Disconnected from authenticating user root 193.46.255.20 port 35386 [preauth]
Oct 08 19:18:35 compute-0 sshd-session[155257]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Oct 08 19:18:35 compute-0 podman[155262]: 2025-10-08 19:18:35.673259601 +0000 UTC m=+0.086478501 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 08 19:18:36 compute-0 nova_compute[117514]: 2025-10-08 19:18:36.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:36 compute-0 unix_chkpwd[155285]: password check failed for user (root)
Oct 08 19:18:36 compute-0 sshd-session[155283]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Oct 08 19:18:36 compute-0 nova_compute[117514]: 2025-10-08 19:18:36.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:38 compute-0 sshd-session[155283]: Failed password for root from 193.46.255.20 port 13784 ssh2
Oct 08 19:18:39 compute-0 unix_chkpwd[155286]: password check failed for user (root)
Oct 08 19:18:40 compute-0 sshd-session[155283]: Failed password for root from 193.46.255.20 port 13784 ssh2
Oct 08 19:18:41 compute-0 nova_compute[117514]: 2025-10-08 19:18:41.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:41 compute-0 unix_chkpwd[155287]: password check failed for user (root)
Oct 08 19:18:41 compute-0 nova_compute[117514]: 2025-10-08 19:18:41.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:42 compute-0 podman[155290]: 2025-10-08 19:18:42.64138494 +0000 UTC m=+0.050302749 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:18:42 compute-0 podman[155288]: 2025-10-08 19:18:42.64556646 +0000 UTC m=+0.061250614 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 08 19:18:42 compute-0 podman[155289]: 2025-10-08 19:18:42.655316111 +0000 UTC m=+0.067691690 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 08 19:18:42 compute-0 sshd-session[155283]: Failed password for root from 193.46.255.20 port 13784 ssh2
Oct 08 19:18:43 compute-0 sshd-session[155283]: Received disconnect from 193.46.255.20 port 13784:11:  [preauth]
Oct 08 19:18:43 compute-0 sshd-session[155283]: Disconnected from authenticating user root 193.46.255.20 port 13784 [preauth]
Oct 08 19:18:43 compute-0 sshd-session[155283]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Oct 08 19:18:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:18:44.238 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:18:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:18:44.238 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:18:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:18:44.238 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:18:44 compute-0 unix_chkpwd[155356]: password check failed for user (root)
Oct 08 19:18:44 compute-0 sshd-session[155354]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Oct 08 19:18:45 compute-0 podman[155357]: 2025-10-08 19:18:45.658186288 +0000 UTC m=+0.073183388 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:18:45 compute-0 podman[155359]: 2025-10-08 19:18:45.715910569 +0000 UTC m=+0.075552087 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 08 19:18:45 compute-0 podman[155358]: 2025-10-08 19:18:45.740849286 +0000 UTC m=+0.145863239 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:18:46 compute-0 nova_compute[117514]: 2025-10-08 19:18:46.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:46 compute-0 sshd-session[155354]: Failed password for root from 193.46.255.20 port 62690 ssh2
Oct 08 19:18:46 compute-0 nova_compute[117514]: 2025-10-08 19:18:46.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:47 compute-0 unix_chkpwd[155420]: password check failed for user (root)
Oct 08 19:18:49 compute-0 sshd-session[155354]: Failed password for root from 193.46.255.20 port 62690 ssh2
Oct 08 19:18:49 compute-0 unix_chkpwd[155421]: password check failed for user (root)
Oct 08 19:18:51 compute-0 nova_compute[117514]: 2025-10-08 19:18:51.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:51 compute-0 sshd-session[155354]: Failed password for root from 193.46.255.20 port 62690 ssh2
Oct 08 19:18:51 compute-0 nova_compute[117514]: 2025-10-08 19:18:51.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:52 compute-0 sshd-session[155354]: Received disconnect from 193.46.255.20 port 62690:11:  [preauth]
Oct 08 19:18:52 compute-0 sshd-session[155354]: Disconnected from authenticating user root 193.46.255.20 port 62690 [preauth]
Oct 08 19:18:52 compute-0 sshd-session[155354]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Oct 08 19:18:56 compute-0 nova_compute[117514]: 2025-10-08 19:18:56.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:56 compute-0 nova_compute[117514]: 2025-10-08 19:18:56.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:18:57 compute-0 podman[155422]: 2025-10-08 19:18:57.637736581 +0000 UTC m=+0.062801939 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 08 19:19:01 compute-0 nova_compute[117514]: 2025-10-08 19:19:01.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:01 compute-0 nova_compute[117514]: 2025-10-08 19:19:01.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:06 compute-0 nova_compute[117514]: 2025-10-08 19:19:06.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:06 compute-0 nova_compute[117514]: 2025-10-08 19:19:06.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:06 compute-0 podman[155446]: 2025-10-08 19:19:06.736614327 +0000 UTC m=+0.150165895 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 08 19:19:11 compute-0 nova_compute[117514]: 2025-10-08 19:19:11.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:11 compute-0 nova_compute[117514]: 2025-10-08 19:19:11.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:12 compute-0 nova_compute[117514]: 2025-10-08 19:19:12.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:19:13 compute-0 podman[155468]: 2025-10-08 19:19:13.692589966 +0000 UTC m=+0.094170752 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct 08 19:19:13 compute-0 podman[155467]: 2025-10-08 19:19:13.697329243 +0000 UTC m=+0.102685098 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 08 19:19:13 compute-0 nova_compute[117514]: 2025-10-08 19:19:13.713 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:19:13 compute-0 podman[155466]: 2025-10-08 19:19:13.715218588 +0000 UTC m=+0.128801050 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, release=1755695350, vcs-type=git, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 08 19:19:13 compute-0 nova_compute[117514]: 2025-10-08 19:19:13.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:19:13 compute-0 nova_compute[117514]: 2025-10-08 19:19:13.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.718 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.718 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.733 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 08 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.734 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.759 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.760 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.760 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.760 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.988 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.989 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6052MB free_disk=73.40883255004883GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.989 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.990 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:19:15 compute-0 nova_compute[117514]: 2025-10-08 19:19:15.045 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:19:15 compute-0 nova_compute[117514]: 2025-10-08 19:19:15.046 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:19:15 compute-0 nova_compute[117514]: 2025-10-08 19:19:15.067 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:19:15 compute-0 nova_compute[117514]: 2025-10-08 19:19:15.080 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:19:15 compute-0 nova_compute[117514]: 2025-10-08 19:19:15.082 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:19:15 compute-0 nova_compute[117514]: 2025-10-08 19:19:15.082 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:19:16 compute-0 nova_compute[117514]: 2025-10-08 19:19:16.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:16 compute-0 podman[155532]: 2025-10-08 19:19:16.671760019 +0000 UTC m=+0.087052727 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible)
Oct 08 19:19:16 compute-0 podman[155534]: 2025-10-08 19:19:16.672336156 +0000 UTC m=+0.075552087 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 08 19:19:16 compute-0 podman[155533]: 2025-10-08 19:19:16.711575715 +0000 UTC m=+0.118503072 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Oct 08 19:19:16 compute-0 nova_compute[117514]: 2025-10-08 19:19:16.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:17 compute-0 nova_compute[117514]: 2025-10-08 19:19:17.065 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:19:17 compute-0 nova_compute[117514]: 2025-10-08 19:19:17.066 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:19:17 compute-0 nova_compute[117514]: 2025-10-08 19:19:17.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:19:18 compute-0 nova_compute[117514]: 2025-10-08 19:19:18.713 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:19:18 compute-0 nova_compute[117514]: 2025-10-08 19:19:18.740 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:19:21 compute-0 nova_compute[117514]: 2025-10-08 19:19:21.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:21 compute-0 nova_compute[117514]: 2025-10-08 19:19:21.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:26 compute-0 nova_compute[117514]: 2025-10-08 19:19:26.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:26 compute-0 nova_compute[117514]: 2025-10-08 19:19:26.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:28 compute-0 podman[155594]: 2025-10-08 19:19:28.656991313 +0000 UTC m=+0.070862620 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 08 19:19:31 compute-0 nova_compute[117514]: 2025-10-08 19:19:31.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:31 compute-0 nova_compute[117514]: 2025-10-08 19:19:31.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:36 compute-0 nova_compute[117514]: 2025-10-08 19:19:36.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:36 compute-0 nova_compute[117514]: 2025-10-08 19:19:36.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:37 compute-0 podman[155619]: 2025-10-08 19:19:37.678113388 +0000 UTC m=+0.095794478 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 08 19:19:41 compute-0 nova_compute[117514]: 2025-10-08 19:19:41.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:41 compute-0 nova_compute[117514]: 2025-10-08 19:19:41.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:19:44.239 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:19:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:19:44.239 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:19:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:19:44.240 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:19:44 compute-0 podman[155640]: 2025-10-08 19:19:44.6538235 +0000 UTC m=+0.075318728 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:19:44 compute-0 podman[155639]: 2025-10-08 19:19:44.654285553 +0000 UTC m=+0.079054845 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 08 19:19:44 compute-0 podman[155641]: 2025-10-08 19:19:44.675430842 +0000 UTC m=+0.087954002 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 08 19:19:46 compute-0 nova_compute[117514]: 2025-10-08 19:19:46.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:46 compute-0 nova_compute[117514]: 2025-10-08 19:19:46.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:47 compute-0 podman[155700]: 2025-10-08 19:19:47.66498048 +0000 UTC m=+0.073652630 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid)
Oct 08 19:19:47 compute-0 podman[155702]: 2025-10-08 19:19:47.678028166 +0000 UTC m=+0.072926790 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct 08 19:19:47 compute-0 podman[155701]: 2025-10-08 19:19:47.78417096 +0000 UTC m=+0.189536545 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:19:51 compute-0 nova_compute[117514]: 2025-10-08 19:19:51.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:51 compute-0 nova_compute[117514]: 2025-10-08 19:19:51.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:56 compute-0 nova_compute[117514]: 2025-10-08 19:19:56.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:56 compute-0 nova_compute[117514]: 2025-10-08 19:19:56.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:19:59 compute-0 podman[155763]: 2025-10-08 19:19:59.657328366 +0000 UTC m=+0.064477526 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:20:01 compute-0 nova_compute[117514]: 2025-10-08 19:20:01.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:01 compute-0 nova_compute[117514]: 2025-10-08 19:20:01.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:06 compute-0 nova_compute[117514]: 2025-10-08 19:20:06.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:06 compute-0 nova_compute[117514]: 2025-10-08 19:20:06.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:20:08 compute-0 systemd[1]: Starting system activity accounting tool...
Oct 08 19:20:08 compute-0 systemd[1]: sysstat-collect.service: Deactivated successfully.
Oct 08 19:20:08 compute-0 systemd[1]: Finished system activity accounting tool.
Oct 08 19:20:08 compute-0 podman[155787]: 2025-10-08 19:20:08.65549773 +0000 UTC m=+0.077291965 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:20:11 compute-0 nova_compute[117514]: 2025-10-08 19:20:11.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:11 compute-0 nova_compute[117514]: 2025-10-08 19:20:11.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:12 compute-0 nova_compute[117514]: 2025-10-08 19:20:12.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:20:13 compute-0 nova_compute[117514]: 2025-10-08 19:20:13.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:20:13 compute-0 nova_compute[117514]: 2025-10-08 19:20:13.718 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:20:14 compute-0 nova_compute[117514]: 2025-10-08 19:20:14.712 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:20:15 compute-0 podman[155809]: 2025-10-08 19:20:15.677659099 +0000 UTC m=+0.084523853 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 08 19:20:15 compute-0 podman[155808]: 2025-10-08 19:20:15.677789963 +0000 UTC m=+0.088761805 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, config_id=edpm, distribution-scope=public)
Oct 08 19:20:15 compute-0 podman[155810]: 2025-10-08 19:20:15.678765891 +0000 UTC m=+0.080258121 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 08 19:20:15 compute-0 nova_compute[117514]: 2025-10-08 19:20:15.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:20:15 compute-0 nova_compute[117514]: 2025-10-08 19:20:15.745 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:20:15 compute-0 nova_compute[117514]: 2025-10-08 19:20:15.745 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:20:15 compute-0 nova_compute[117514]: 2025-10-08 19:20:15.745 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:20:15 compute-0 nova_compute[117514]: 2025-10-08 19:20:15.746 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.000 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.001 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6071MB free_disk=73.40880966186523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.002 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.002 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.066 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.067 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.088 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.105 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.107 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.108 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:17 compute-0 nova_compute[117514]: 2025-10-08 19:20:17.108 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:20:17 compute-0 nova_compute[117514]: 2025-10-08 19:20:17.109 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:20:17 compute-0 nova_compute[117514]: 2025-10-08 19:20:17.109 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:20:17 compute-0 nova_compute[117514]: 2025-10-08 19:20:17.122 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 08 19:20:17 compute-0 nova_compute[117514]: 2025-10-08 19:20:17.123 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:20:17 compute-0 nova_compute[117514]: 2025-10-08 19:20:17.124 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:20:18 compute-0 podman[155870]: 2025-10-08 19:20:18.674842418 +0000 UTC m=+0.089259969 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:20:18 compute-0 podman[155872]: 2025-10-08 19:20:18.676254539 +0000 UTC m=+0.079884560 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Oct 08 19:20:18 compute-0 nova_compute[117514]: 2025-10-08 19:20:18.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:20:18 compute-0 podman[155871]: 2025-10-08 19:20:18.721346326 +0000 UTC m=+0.130915708 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:20:19 compute-0 nova_compute[117514]: 2025-10-08 19:20:19.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:20:21 compute-0 nova_compute[117514]: 2025-10-08 19:20:21.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:21 compute-0 nova_compute[117514]: 2025-10-08 19:20:21.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:26 compute-0 nova_compute[117514]: 2025-10-08 19:20:26.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:26 compute-0 nova_compute[117514]: 2025-10-08 19:20:26.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:30 compute-0 podman[155931]: 2025-10-08 19:20:30.663037704 +0000 UTC m=+0.076615015 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 08 19:20:31 compute-0 nova_compute[117514]: 2025-10-08 19:20:31.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:31 compute-0 nova_compute[117514]: 2025-10-08 19:20:31.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:36 compute-0 nova_compute[117514]: 2025-10-08 19:20:36.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:36 compute-0 nova_compute[117514]: 2025-10-08 19:20:36.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:39 compute-0 podman[155955]: 2025-10-08 19:20:39.67824362 +0000 UTC m=+0.090950738 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 08 19:20:41 compute-0 nova_compute[117514]: 2025-10-08 19:20:41.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:41 compute-0 nova_compute[117514]: 2025-10-08 19:20:41.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:20:44.240 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:20:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:20:44.240 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:20:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:20:44.240 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:20:46 compute-0 nova_compute[117514]: 2025-10-08 19:20:46.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:46 compute-0 podman[155976]: 2025-10-08 19:20:46.65563922 +0000 UTC m=+0.069838470 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:20:46 compute-0 podman[155975]: 2025-10-08 19:20:46.664025182 +0000 UTC m=+0.083238946 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, release=1755695350, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 08 19:20:46 compute-0 podman[155977]: 2025-10-08 19:20:46.667297686 +0000 UTC m=+0.066529005 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:20:46 compute-0 nova_compute[117514]: 2025-10-08 19:20:46.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:49 compute-0 podman[156038]: 2025-10-08 19:20:49.654893088 +0000 UTC m=+0.079531439 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 08 19:20:49 compute-0 podman[156040]: 2025-10-08 19:20:49.664800143 +0000 UTC m=+0.074269918 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:20:49 compute-0 podman[156039]: 2025-10-08 19:20:49.728466545 +0000 UTC m=+0.138951339 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 08 19:20:51 compute-0 nova_compute[117514]: 2025-10-08 19:20:51.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:51 compute-0 nova_compute[117514]: 2025-10-08 19:20:51.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:56 compute-0 nova_compute[117514]: 2025-10-08 19:20:56.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:20:56 compute-0 nova_compute[117514]: 2025-10-08 19:20:56.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:01 compute-0 nova_compute[117514]: 2025-10-08 19:21:01.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:01 compute-0 podman[156100]: 2025-10-08 19:21:01.670172006 +0000 UTC m=+0.090460694 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 08 19:21:01 compute-0 nova_compute[117514]: 2025-10-08 19:21:01.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:01 compute-0 nova_compute[117514]: 2025-10-08 19:21:01.933 2 DEBUG oslo_concurrency.processutils [None req-726d2452-5976-48e6-a261-8201e24bb8bf 4109eb10f1504d00848780f1ed22af42 0776a2a010754884a7b224f3b08ef53b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 08 19:21:01 compute-0 nova_compute[117514]: 2025-10-08 19:21:01.957 2 DEBUG oslo_concurrency.processutils [None req-726d2452-5976-48e6-a261-8201e24bb8bf 4109eb10f1504d00848780f1ed22af42 0776a2a010754884a7b224f3b08ef53b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 08 19:21:06 compute-0 nova_compute[117514]: 2025-10-08 19:21:06.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:06 compute-0 nova_compute[117514]: 2025-10-08 19:21:06.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:07 compute-0 nova_compute[117514]: 2025-10-08 19:21:07.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:07 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:21:07.517 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 08 19:21:07 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:21:07.518 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 08 19:21:10 compute-0 podman[156125]: 2025-10-08 19:21:10.658190987 +0000 UTC m=+0.085357227 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible)
Oct 08 19:21:11 compute-0 nova_compute[117514]: 2025-10-08 19:21:11.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:11 compute-0 nova_compute[117514]: 2025-10-08 19:21:11.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:13 compute-0 nova_compute[117514]: 2025-10-08 19:21:13.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:21:14 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:21:14.522 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 08 19:21:14 compute-0 nova_compute[117514]: 2025-10-08 19:21:14.713 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.756 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.757 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.757 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.758 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.994 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.995 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6079MB free_disk=73.40882873535156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.995 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.996 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.300 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.301 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.370 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Refreshing inventories for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 08 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.434 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updating ProviderTree inventory for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 08 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.435 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updating inventory in ProviderTree for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 08 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.451 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Refreshing aggregate associations for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 08 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.504 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Refreshing trait associations for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AVX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 08 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.533 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.567 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.569 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.570 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:17 compute-0 podman[156147]: 2025-10-08 19:21:17.66417527 +0000 UTC m=+0.087676604 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41)
Oct 08 19:21:17 compute-0 podman[156148]: 2025-10-08 19:21:17.675213357 +0000 UTC m=+0.061566301 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct 08 19:21:17 compute-0 podman[156149]: 2025-10-08 19:21:17.707067084 +0000 UTC m=+0.082978618 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 08 19:21:18 compute-0 nova_compute[117514]: 2025-10-08 19:21:18.570 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:21:18 compute-0 nova_compute[117514]: 2025-10-08 19:21:18.570 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:21:18 compute-0 nova_compute[117514]: 2025-10-08 19:21:18.570 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:21:18 compute-0 nova_compute[117514]: 2025-10-08 19:21:18.595 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 08 19:21:18 compute-0 nova_compute[117514]: 2025-10-08 19:21:18.596 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:21:18 compute-0 nova_compute[117514]: 2025-10-08 19:21:18.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:21:18 compute-0 nova_compute[117514]: 2025-10-08 19:21:18.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:21:20 compute-0 podman[156210]: 2025-10-08 19:21:20.682965179 +0000 UTC m=+0.090380641 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3)
Oct 08 19:21:20 compute-0 podman[156212]: 2025-10-08 19:21:20.686507091 +0000 UTC m=+0.073515956 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:21:20 compute-0 podman[156211]: 2025-10-08 19:21:20.710752579 +0000 UTC m=+0.117721339 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:21:20 compute-0 nova_compute[117514]: 2025-10-08 19:21:20.712 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:21:20 compute-0 nova_compute[117514]: 2025-10-08 19:21:20.728 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:21:20 compute-0 nova_compute[117514]: 2025-10-08 19:21:20.728 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 08 19:21:20 compute-0 nova_compute[117514]: 2025-10-08 19:21:20.741 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 08 19:21:21 compute-0 nova_compute[117514]: 2025-10-08 19:21:21.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:21 compute-0 nova_compute[117514]: 2025-10-08 19:21:21.731 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:21:21 compute-0 nova_compute[117514]: 2025-10-08 19:21:21.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:22 compute-0 nova_compute[117514]: 2025-10-08 19:21:22.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:21:26 compute-0 nova_compute[117514]: 2025-10-08 19:21:26.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:26 compute-0 nova_compute[117514]: 2025-10-08 19:21:26.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:27 compute-0 nova_compute[117514]: 2025-10-08 19:21:27.734 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:21:27 compute-0 nova_compute[117514]: 2025-10-08 19:21:27.735 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 08 19:21:31 compute-0 nova_compute[117514]: 2025-10-08 19:21:31.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:31 compute-0 nova_compute[117514]: 2025-10-08 19:21:31.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:32 compute-0 podman[156270]: 2025-10-08 19:21:32.6312924 +0000 UTC m=+0.053182341 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 08 19:21:34 compute-0 nova_compute[117514]: 2025-10-08 19:21:34.131 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:21:36 compute-0 nova_compute[117514]: 2025-10-08 19:21:36.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:36 compute-0 nova_compute[117514]: 2025-10-08 19:21:36.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:41 compute-0 nova_compute[117514]: 2025-10-08 19:21:41.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:41 compute-0 podman[156295]: 2025-10-08 19:21:41.666985364 +0000 UTC m=+0.078466479 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, managed_by=edpm_ansible)
Oct 08 19:21:41 compute-0 nova_compute[117514]: 2025-10-08 19:21:41.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:21:44.242 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:21:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:21:44.242 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:21:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:21:44.243 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:21:46 compute-0 nova_compute[117514]: 2025-10-08 19:21:46.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:46 compute-0 nova_compute[117514]: 2025-10-08 19:21:46.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:48 compute-0 podman[156317]: 2025-10-08 19:21:48.661300582 +0000 UTC m=+0.078406127 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct 08 19:21:48 compute-0 podman[156318]: 2025-10-08 19:21:48.679601799 +0000 UTC m=+0.082822604 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 08 19:21:48 compute-0 podman[156316]: 2025-10-08 19:21:48.695839206 +0000 UTC m=+0.114585968 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 08 19:21:51 compute-0 nova_compute[117514]: 2025-10-08 19:21:51.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:51 compute-0 podman[156380]: 2025-10-08 19:21:51.688456625 +0000 UTC m=+0.089289261 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 08 19:21:51 compute-0 podman[156378]: 2025-10-08 19:21:51.692160011 +0000 UTC m=+0.112413365 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 08 19:21:51 compute-0 podman[156379]: 2025-10-08 19:21:51.704074924 +0000 UTC m=+0.118342906 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 08 19:21:51 compute-0 nova_compute[117514]: 2025-10-08 19:21:51.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:56 compute-0 nova_compute[117514]: 2025-10-08 19:21:56.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:21:57 compute-0 nova_compute[117514]: 2025-10-08 19:21:57.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:01 compute-0 nova_compute[117514]: 2025-10-08 19:22:01.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:02 compute-0 nova_compute[117514]: 2025-10-08 19:22:02.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:03 compute-0 podman[156440]: 2025-10-08 19:22:03.659518857 +0000 UTC m=+0.079614022 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 08 19:22:06 compute-0 nova_compute[117514]: 2025-10-08 19:22:06.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:07 compute-0 nova_compute[117514]: 2025-10-08 19:22:07.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct 08 19:22:11 compute-0 nova_compute[117514]: 2025-10-08 19:22:11.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:12 compute-0 nova_compute[117514]: 2025-10-08 19:22:12.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:12 compute-0 podman[156465]: 2025-10-08 19:22:12.668979658 +0000 UTC m=+0.093848662 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 08 19:22:14 compute-0 nova_compute[117514]: 2025-10-08 19:22:14.736 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:22:14 compute-0 nova_compute[117514]: 2025-10-08 19:22:14.737 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:22:15 compute-0 nova_compute[117514]: 2025-10-08 19:22:15.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:22:15 compute-0 nova_compute[117514]: 2025-10-08 19:22:15.753 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:22:15 compute-0 nova_compute[117514]: 2025-10-08 19:22:15.754 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:22:15 compute-0 nova_compute[117514]: 2025-10-08 19:22:15.754 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:22:15 compute-0 nova_compute[117514]: 2025-10-08 19:22:15.754 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:22:15 compute-0 nova_compute[117514]: 2025-10-08 19:22:15.993 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:22:15 compute-0 nova_compute[117514]: 2025-10-08 19:22:15.995 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6069MB free_disk=73.40876770019531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:22:15 compute-0 nova_compute[117514]: 2025-10-08 19:22:15.995 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:22:15 compute-0 nova_compute[117514]: 2025-10-08 19:22:15.996 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:22:16 compute-0 nova_compute[117514]: 2025-10-08 19:22:16.070 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:22:16 compute-0 nova_compute[117514]: 2025-10-08 19:22:16.071 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:22:16 compute-0 nova_compute[117514]: 2025-10-08 19:22:16.106 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:22:16 compute-0 nova_compute[117514]: 2025-10-08 19:22:16.122 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:22:16 compute-0 nova_compute[117514]: 2025-10-08 19:22:16.125 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:22:16 compute-0 nova_compute[117514]: 2025-10-08 19:22:16.125 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:22:16 compute-0 nova_compute[117514]: 2025-10-08 19:22:16.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:17 compute-0 nova_compute[117514]: 2025-10-08 19:22:17.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:18 compute-0 nova_compute[117514]: 2025-10-08 19:22:18.126 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:22:18 compute-0 nova_compute[117514]: 2025-10-08 19:22:18.127 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:22:19 compute-0 podman[156485]: 2025-10-08 19:22:19.670322078 +0000 UTC m=+0.081499076 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, config_id=edpm, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 08 19:22:19 compute-0 podman[156487]: 2025-10-08 19:22:19.681551971 +0000 UTC m=+0.080415825 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct 08 19:22:19 compute-0 podman[156486]: 2025-10-08 19:22:19.685094053 +0000 UTC m=+0.089113825 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 08 19:22:19 compute-0 nova_compute[117514]: 2025-10-08 19:22:19.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:22:19 compute-0 nova_compute[117514]: 2025-10-08 19:22:19.718 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:22:19 compute-0 nova_compute[117514]: 2025-10-08 19:22:19.718 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:22:19 compute-0 nova_compute[117514]: 2025-10-08 19:22:19.741 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 08 19:22:19 compute-0 nova_compute[117514]: 2025-10-08 19:22:19.741 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:22:19 compute-0 nova_compute[117514]: 2025-10-08 19:22:19.742 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:22:20 compute-0 nova_compute[117514]: 2025-10-08 19:22:20.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:22:21 compute-0 nova_compute[117514]: 2025-10-08 19:22:21.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:22 compute-0 nova_compute[117514]: 2025-10-08 19:22:22.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:22 compute-0 podman[156542]: 2025-10-08 19:22:22.672479609 +0000 UTC m=+0.083874224 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 08 19:22:22 compute-0 podman[156544]: 2025-10-08 19:22:22.672931062 +0000 UTC m=+0.079468068 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Oct 08 19:22:22 compute-0 podman[156543]: 2025-10-08 19:22:22.714023824 +0000 UTC m=+0.125826841 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true)
Oct 08 19:22:23 compute-0 nova_compute[117514]: 2025-10-08 19:22:23.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:22:26 compute-0 nova_compute[117514]: 2025-10-08 19:22:26.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:27 compute-0 nova_compute[117514]: 2025-10-08 19:22:27.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:31 compute-0 nova_compute[117514]: 2025-10-08 19:22:31.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:32 compute-0 nova_compute[117514]: 2025-10-08 19:22:32.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:34 compute-0 podman[156608]: 2025-10-08 19:22:34.652120299 +0000 UTC m=+0.068415889 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 08 19:22:36 compute-0 nova_compute[117514]: 2025-10-08 19:22:36.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:37 compute-0 nova_compute[117514]: 2025-10-08 19:22:37.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:41 compute-0 nova_compute[117514]: 2025-10-08 19:22:41.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:42 compute-0 nova_compute[117514]: 2025-10-08 19:22:42.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:43 compute-0 podman[156633]: 2025-10-08 19:22:43.700023764 +0000 UTC m=+0.113170817 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 08 19:22:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:22:44.244 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:22:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:22:44.244 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:22:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:22:44.244 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:22:46 compute-0 nova_compute[117514]: 2025-10-08 19:22:46.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:47 compute-0 nova_compute[117514]: 2025-10-08 19:22:47.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:50 compute-0 podman[156657]: 2025-10-08 19:22:50.635969733 +0000 UTC m=+0.047371474 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:22:50 compute-0 podman[156655]: 2025-10-08 19:22:50.671069533 +0000 UTC m=+0.087567641 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=minimal rhel9, version=9.6, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 08 19:22:50 compute-0 podman[156656]: 2025-10-08 19:22:50.682187963 +0000 UTC m=+0.086644754 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=multipathd)
Oct 08 19:22:51 compute-0 nova_compute[117514]: 2025-10-08 19:22:51.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:52 compute-0 nova_compute[117514]: 2025-10-08 19:22:52.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:53 compute-0 podman[156715]: 2025-10-08 19:22:53.648632596 +0000 UTC m=+0.071560720 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=iscsid, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:22:53 compute-0 podman[156717]: 2025-10-08 19:22:53.6894248 +0000 UTC m=+0.087855669 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 08 19:22:53 compute-0 podman[156716]: 2025-10-08 19:22:53.69500029 +0000 UTC m=+0.104020564 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 08 19:22:56 compute-0 nova_compute[117514]: 2025-10-08 19:22:56.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:22:57 compute-0 nova_compute[117514]: 2025-10-08 19:22:57.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:23:01 compute-0 nova_compute[117514]: 2025-10-08 19:23:01.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:23:02 compute-0 nova_compute[117514]: 2025-10-08 19:23:02.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:23:05 compute-0 podman[156779]: 2025-10-08 19:23:05.644317677 +0000 UTC m=+0.065252588 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 08 19:23:06 compute-0 nova_compute[117514]: 2025-10-08 19:23:06.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:23:07 compute-0 nova_compute[117514]: 2025-10-08 19:23:07.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:23:11 compute-0 nova_compute[117514]: 2025-10-08 19:23:11.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:23:12 compute-0 nova_compute[117514]: 2025-10-08 19:23:12.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:23:12 compute-0 sshd-session[156804]: Accepted publickey for zuul from 192.168.122.10 port 34846 ssh2: ECDSA SHA256:i+73Mx2Y/ukt1b+huf+9w+ftZalnyybbDU6glTR0JfU
Oct 08 19:23:12 compute-0 systemd[1]: Created slice User Slice of UID 1000.
Oct 08 19:23:12 compute-0 systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct 08 19:23:12 compute-0 systemd-logind[844]: New session 16 of user zuul.
Oct 08 19:23:12 compute-0 systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct 08 19:23:12 compute-0 systemd[1]: Starting User Manager for UID 1000...
Oct 08 19:23:12 compute-0 systemd[156808]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 08 19:23:12 compute-0 systemd[156808]: Queued start job for default target Main User Target.
Oct 08 19:23:12 compute-0 systemd[156808]: Created slice User Application Slice.
Oct 08 19:23:12 compute-0 systemd[156808]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 08 19:23:12 compute-0 systemd[156808]: Started Daily Cleanup of User's Temporary Directories.
Oct 08 19:23:12 compute-0 systemd[156808]: Reached target Paths.
Oct 08 19:23:12 compute-0 systemd[156808]: Reached target Timers.
Oct 08 19:23:12 compute-0 systemd[156808]: Starting D-Bus User Message Bus Socket...
Oct 08 19:23:12 compute-0 systemd[156808]: Starting Create User's Volatile Files and Directories...
Oct 08 19:23:12 compute-0 systemd[156808]: Listening on D-Bus User Message Bus Socket.
Oct 08 19:23:12 compute-0 systemd[156808]: Reached target Sockets.
Oct 08 19:23:12 compute-0 systemd[156808]: Finished Create User's Volatile Files and Directories.
Oct 08 19:23:12 compute-0 systemd[156808]: Reached target Basic System.
Oct 08 19:23:12 compute-0 systemd[156808]: Reached target Main User Target.
Oct 08 19:23:12 compute-0 systemd[156808]: Startup finished in 173ms.
Oct 08 19:23:12 compute-0 systemd[1]: Started User Manager for UID 1000.
Oct 08 19:23:12 compute-0 systemd[1]: Started Session 16 of User zuul.
Oct 08 19:23:12 compute-0 sshd-session[156804]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 08 19:23:12 compute-0 sudo[156825]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Oct 08 19:23:12 compute-0 sudo[156825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 08 19:23:14 compute-0 podman[156859]: 2025-10-08 19:23:14.098099148 +0000 UTC m=+0.111297123 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible)
Oct 08 19:23:15 compute-0 nova_compute[117514]: 2025-10-08 19:23:15.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.713 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.750 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.750 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.751 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.751 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 08 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.937 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 08 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.939 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5960MB free_disk=73.40849304199219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 08 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.939 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 08 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.939 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 08 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.997 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 08 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.997 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 08 19:23:17 compute-0 nova_compute[117514]: 2025-10-08 19:23:17.019 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 08 19:23:17 compute-0 nova_compute[117514]: 2025-10-08 19:23:17.031 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 08 19:23:17 compute-0 nova_compute[117514]: 2025-10-08 19:23:17.032 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 08 19:23:17 compute-0 nova_compute[117514]: 2025-10-08 19:23:17.033 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 08 19:23:17 compute-0 nova_compute[117514]: 2025-10-08 19:23:17.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:23:17 compute-0 ovs-vsctl[157017]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct 08 19:23:18 compute-0 virtqemud[117415]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct 08 19:23:18 compute-0 virtqemud[117415]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct 08 19:23:18 compute-0 virtqemud[117415]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 08 19:23:19 compute-0 nova_compute[117514]: 2025-10-08 19:23:19.033 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:23:19 compute-0 nova_compute[117514]: 2025-10-08 19:23:19.035 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 08 19:23:19 compute-0 nova_compute[117514]: 2025-10-08 19:23:19.719 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:23:19 compute-0 nova_compute[117514]: 2025-10-08 19:23:19.719 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 08 19:23:19 compute-0 nova_compute[117514]: 2025-10-08 19:23:19.719 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 08 19:23:19 compute-0 nova_compute[117514]: 2025-10-08 19:23:19.879 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 08 19:23:19 compute-0 nova_compute[117514]: 2025-10-08 19:23:19.880 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:23:20 compute-0 crontab[157431]: (root) LIST (root)
Oct 08 19:23:20 compute-0 nova_compute[117514]: 2025-10-08 19:23:20.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:23:21 compute-0 nova_compute[117514]: 2025-10-08 19:23:21.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:23:21 compute-0 podman[157523]: 2025-10-08 19:23:21.661374526 +0000 UTC m=+0.071389645 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_id=edpm)
Oct 08 19:23:21 compute-0 podman[157525]: 2025-10-08 19:23:21.685894281 +0000 UTC m=+0.100363868 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 08 19:23:21 compute-0 podman[157526]: 2025-10-08 19:23:21.690850994 +0000 UTC m=+0.102382867 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct 08 19:23:21 compute-0 nova_compute[117514]: 2025-10-08 19:23:21.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:23:22 compute-0 nova_compute[117514]: 2025-10-08 19:23:22.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 08 19:23:22 compute-0 systemd[1]: Starting Hostname Service...
Oct 08 19:23:22 compute-0 systemd[1]: Started Hostname Service.
Oct 08 19:23:22 compute-0 nova_compute[117514]: 2025-10-08 19:23:22.712 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 08 19:23:23 compute-0 podman[157664]: 2025-10-08 19:23:23.807235779 +0000 UTC m=+0.107008760 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 08 19:23:23 compute-0 podman[157689]: 2025-10-08 19:23:23.923073072 +0000 UTC m=+0.065663500 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 08 19:23:23 compute-0 podman[157687]: 2025-10-08 19:23:23.986744985 +0000 UTC m=+0.136145669 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 08 19:23:24 compute-0 nova_compute[117514]: 2025-10-08 19:23:24.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
